4 changes: 2 additions & 2 deletions docs/datadoc/customisation.md
@@ -177,9 +177,9 @@ You can save this context to a triplestore with
>>> ts = Triplestore("rdflib")
>>> save_datadoc( # doctest: +ELLIPSIS
... ts,
... "https://raw.githubusercontent.com/EMMC-ASBL/tripper/refs/heads/master/tests/input/custom_context.yaml",
... "https://raw.githubusercontent.com/EMMC-ASBL/tripper/refs/heads/master/tests/input/custom_context.yaml",
... )
AttrDict(...)
{'@context': [...]}

```

12 changes: 6 additions & 6 deletions docs/datadoc/documenting-a-resource.md
@@ -37,7 +37,7 @@ We therefore have to define them explicitly

```python
>>> prefixes = {
... "sem": "https://w3id.com/emmo/domain/sem/0.1#",
... "sem": "https://w3id.org/emmo/domain/sem/0.1#",
... "kb": "http://example.com/kb/"
... }

@@ -60,7 +60,7 @@ We therefore have to define them explicitly
{
"@context": "https://raw.githubusercontent.com/EMMC-ASBL/tripper/refs/heads/master/tripper/context/0.3/context.json",
"@id": "http://example.com/kb/image1",
"@type": "https://w3id.com/emmo/domain/sem/0.1#SEMImage",
"@type": "https://w3id.org/emmo/domain/sem/0.1#SEMImage",
"creator": {
"@type": [
"http://xmlns.com/foaf/0.1/Agent",
@@ -104,7 +104,7 @@ You can use `ts.serialize()` to list the content of the triplestore (defaults to
@prefix foaf: <http://xmlns.com/foaf/0.1/> .
@prefix kb: <http://example.com/kb/> .
@prefix rdf: <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix sem: <https://w3id.com/emmo/domain/sem/0.1#> .
@prefix sem: <https://w3id.org/emmo/domain/sem/0.1#> .
@prefix xsd: <http://www.w3.org/2001/XMLSchema#> .

kb:image1 a sem:SEMImage ;
@@ -145,11 +145,11 @@ Saving [semdata.yaml] to a triplestore can e.g. be done with

```python
>>> from tripper.datadoc import save_datadoc
>>> save_datadoc( # doctest: +ELLIPSIS
>>> save_datadoc( # doctest: +ELLIPSIS, +NORMALIZE_WHITESPACE
... ts,
... "https://raw.githubusercontent.com/EMMC-ASBL/tripper/refs/heads/master/tests/input/semdata.yaml"
... )
AttrDict(...)
{'@graph': [...], ...}

```

@@ -186,7 +186,7 @@ The below example shows how to save all datasets listed in the CSV file [semdata
>>> td = TableDoc.parse_csv(
... "https://raw.githubusercontent.com/EMMC-ASBL/tripper/refs/heads/master/tests/input/semdata.csv",
... prefixes={
... "sem": "https://w3id.com/emmo/domain/sem/0.1#",
... "sem": "https://w3id.org/emmo/domain/sem/0.1#",
... "semdata": "https://he-matchmaker.eu/data/sem/",
... "sample": "https://he-matchmaker.eu/sample/",
... "mat": "https://he-matchmaker.eu/material/",
6 changes: 5 additions & 1 deletion docs/session.md
@@ -57,7 +57,11 @@ you can now do:
```python
>>> from tripper import Literal, Session

>>> session = Session()
# Normally you will call Session with no arguments
>>> session = Session() # doctest: +SKIP

# ...but it is also possible to specify the config file explicitly
>>> session = Session("tests/input/session.yaml")
>>> ts = session.get_triplestore("FusekiTest")
>>> EX = ts.bind("ex", "http://example.com#")

9 changes: 3 additions & 6 deletions docs/tutorial.md
@@ -137,8 +137,7 @@ The `check=True` enables checking for existing IRIs.
>>> EMMO.invalid_name # doctest: +ELLIPSIS
Traceback (most recent call last):
...
tripper.errors.NoSuchIRIError: https://w3id.org/emmo#invalid_name
Maybe you have to remove the cache file: ...
tripper.errors.NoSuchIRIError: https://w3id.org/emmo#invalid_name...

```

@@ -167,8 +166,7 @@ For example:
>>> FOOD.Hamburger # doctest: +ELLIPSIS
Traceback (most recent call last):
...
tripper.errors.NoSuchIRIError: http://onto-ns.com/ontologies/examples/food#Hamburger
...
tripper.errors.NoSuchIRIError: http://onto-ns.com/ontologies/examples/food#Hamburger...

# Add Hamburger to known labels
>>> extend_namespace(FOOD, {"Hamburger": FOOD + "Hamburger"})
@@ -179,8 +177,7 @@
True
>>> FOOD.Fish # doctest: +ELLIPSIS
Traceback (most recent call last):
...
tripper.errors.NoSuchIRIError: http://onto-ns.com/ontologies/examples/food#Fish
...
tripper.errors.NoSuchIRIError: http://onto-ns.com/ontologies/examples/food#Fish...

# Extend FOOD from an online turtle file
>>> extend_namespace(
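The tutorial.md hunks above collapse the expected `NoSuchIRIError` output to a single line ending in `...`, which works because the examples run with the ELLIPSIS doctest directive. As a standalone illustration (not part of the PR; the exception message and names below are made up), this is how a trailing `...` lets the variable tail of an exception message vary:

```python
import doctest

# Made-up doctest: the expected exception message ends with "...", so under
# +ELLIPSIS it matches no matter how the text after "bad value" varies.
example = """
>>> raise ValueError("bad value: maybe remove the cache file /tmp/xyz")  # doctest: +ELLIPSIS
Traceback (most recent call last):
...
ValueError: bad value...
"""

test = doctest.DocTestParser().get_doctest(
    example, globs={}, name="ellipsis-demo", filename=None, lineno=0
)
print(doctest.DocTestRunner().run(test))  # TestResults(failed=0, attempted=1)
```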
4 changes: 2 additions & 2 deletions docs/units/units.md
@@ -49,8 +49,8 @@ Item access creates a subclass of a Pint quantity representation (see [Working w
>>> ureg["Pa"]
<Quantity(1, 'Pascal')>

>>> ureg["N⋅m²"]
<Quantity(1, 'Newton * Metre ** 2')>
>>> ureg["A/m²"]
<Quantity(1.0, 'Ampere / Metre ** 2')>

```

4 changes: 2 additions & 2 deletions tests/input/semdata-context.json
@@ -1,13 +1,13 @@
{
"@context": {
"sem": "https://w3id.com/emmo/domain/sem/0.1#",
"sem": "https://w3id.org/emmo/domain/sem/0.1#",
"semdata": "https://he-matchmaker.eu/data/sem/",
"sample": "https://he-matchmaker.eu/sample/",
"mat": "https://he-matchmaker.eu/material/",
"dm": "http://onto-ns.com/meta/characterisation/0.1/SEMImage#",
"parser": "http://sintef.no/dlite/parser#",
"gen": "http://sintef.no/dlite/generator#",
"micro": "https://w3id.com/emmo/domain/microstructure/0.3#"
"micro": "https://w3id.org/emmo/domain/microstructure/0.3#"
},

"fromSample": {
8 changes: 7 additions & 1 deletion tests/test_markdown_doctest.py
@@ -30,4 +30,10 @@ def test_markdown_doctest():
path = os.path.join(dirpath, filename)
relpath = os.path.relpath(dirpath, str(rootdir))
print(f"-- doctest {relpath}/{filename}")
doctest.testfile(str(path), module_relative=False)
result = doctest.testfile(str(path), module_relative=False)
if result.failed:
raise RuntimeError(
f"failing doctest: {relpath}/{filename}\n"
"To debug, please run:\n\n"
f" python -m doctest {relpath}/{filename}\n"
)
16 changes: 1 addition & 15 deletions tripper/datadoc/dataset.py
@@ -236,11 +236,6 @@ def addsuperclasses(d, cls):
logging.info(
f"Class not in keywords: {', '.join(missing)}",
)
# warnings.warn(
# "No superclass info. Not in keywords file: "
# + ", ".join(missing),
# category=MissingKeywordsClassWarning,
# )

if isinstance(descr, str):
return descr
@@ -276,10 +271,6 @@ def addsuperclasses(d, cls):
if not k.startswith("@") and k not in keywords:
# pylint: disable=logging-fstring-interpolation
logging.info(f"Property not in keywords: {k}")
# warnings.warn(
# f"No range info. Not in keywords file: {k}",
# UnknownKeywordWarning,
# )
if k in ("@context", "@id", "@type"):
pass
elif k == "@graph":
@@ -367,11 +358,6 @@ def save_dict(
context.sync_prefixes(ts)

add(d, "@context", context.get_context_dict())
# if "@context" in d:
# context.add_context(d["@context"])
# d = jsonld.compact(d, context.get_context_dict())
# else:
# d["@context"] = context.get_context_dict()

# Validate
# TODO: reenable validation
@@ -1252,7 +1238,7 @@ def delete(
flags: "Optional[str]" = None,
keywords: "Optional[Keywords]" = None,
) -> None:
"""Delete matching resources. See `search_iris()` for a description of arguments."""
"""Delete matching resources. See `search()` for argument descriptions."""
iris = search_iris(
ts=ts,
type=type,
23 changes: 23 additions & 0 deletions tripper/units/units.py
@@ -1030,6 +1030,29 @@ def asint(x):
def to_ontology_units(self) -> "Quantity":
"""Return new quantity rescale to a unit with the same
dimensionality that exists in the ontology.

Notes:
This function tries to select the "simplest" unit among all the
units with compatible physical dimensionality in the ontology.

This is done according to the following heuristics:

1. Find units with compatible physical dimensionality in the
ontology.
2. Among these units, select the unit that minimises the absolute
value of the sum of the powers of each unit component.

Example: among the units

Pa = Pa^1 -> sum=1
J/m^3 = J^1/m^3 -> sum=1+3=4
N/m^2 = N^1/m^2 -> sum=1+2=3

Pa will be selected.
3. If two units have the same sum, the unit that minimises
`log10(magnitude/5)` is selected, where `magnitude` is the
magnitude of the quantity when expressed in SI base units.

"""
# pylint: disable=protected-access
ureg = self._REGISTRY
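The new docstring describes the unit-selection heuristic in prose. As a rough illustration of step 2 only (this is not tripper's implementation; the candidate units and their component exponents are hard-coded for the example), the ranking of the three units from the docstring looks like this:

```python
# Hypothetical candidates with the exponents of their unit components,
# mirroring the worked example in the docstring above.
candidates = {
    "Pa": {"Pa": 1},             # |1| = 1
    "J/m^3": {"J": 1, "m": -3},  # |1| + |-3| = 4
    "N/m^2": {"N": 1, "m": -2},  # |1| + |-2| = 3
}

def power_sum(components: dict) -> int:
    """Sum of the absolute values of the component exponents."""
    return sum(abs(exponent) for exponent in components.values())

# The unit with the smallest sum is preferred; ties would be broken by the
# log10(magnitude/5) criterion described in point 3 of the docstring.
best = min(candidates, key=lambda name: power_sum(candidates[name]))
print(best)  # -> Pa
```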