
deeponto's Introduction

deeponto


A package for ontology engineering with deep learning.

News 📰

  • Hotfix for the openprompt issue by moving it to optional dependencies. (v0.9.1)
  • Minor feature enhancements; reorganise package layout. (v0.8.9)
  • Deploy deeponto.onto.taxonomy; add the structural reasoner type. (v0.8.8)
  • Deploy various new ontology processing functions, especially for reasoning and verbalisation; update OAEI utilities for evaluation. (v0.8.7)
  • Minor modifications of certain methods; set all utility methods to direct import. (v0.8.5)
  • Deploy OAEI utilities at deeponto.align.oaei for scripts at the sub-repository OAEI-Bio-ML; fix bugs. (v0.8.4)
  • Fix bugs in BERTMap (stuck at reasoning) and ontology alignment evaluation. (v0.8.3)
  • Deploy deeponto.onto.OntologyNormaliser and deeponto.onto.OntologyProjector (v0.8.0).
  • Upload Java dependencies directly and remove mowl from pip dependencies (v0.7.5).
  • Deploy the deeponto.subs.bertsubs and deeponto.onto.pruning modules (v0.7.0).
  • Deploy the deeponto.probe.ontolama and deeponto.onto.verbalisation modules (v0.6.0).
  • Rebuild the whole package based on the OWLAPI; remove owlready2 from the essential dependencies (from v0.5.x).

Check the complete changelog and FAQs. The FAQs page does not contain much information yet but will be updated based on feedback.

About

$\textsf{DeepOnto}$ aims to provide building blocks for implementing deep learning models, constructing resources, and conducting evaluation for various ontology engineering purposes.

Installation

OWLAPI

$\textsf{DeepOnto}$ relies on OWLAPI version 4.5.22 (written in Java) for ontology processing.

We follow the approach implemented in mOWL, which uses JPype to bridge Python and the Java Virtual Machine (JVM). Please check JPype's installation page to ensure the JVM initialises successfully.
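For illustration, here is a minimal sketch of how JPype starts a JVM. This is not $\textsf{DeepOnto}$'s exact initialisation code; the package starts the JVM for you when deeponto.onto is first imported (with a prompt for the maximum heap size, as the issue logs below show):

```python
import jpype
import jpype.imports

# Start a JVM with an 8 GB heap cap (adjust -Xmx to your machine)
# if one is not already running in this process.
if not jpype.isJVMStarted():
    jpype.startJVM(jpype.getDefaultJVMPath(), "-Xmx8g")

# With jpype.imports enabled, Java packages import like Python modules.
from java.lang import System  # type: ignore
print(System.getProperty("java.version"))
```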

PyTorch

$\textsf{DeepOnto}$ relies on PyTorch as its deep learning framework.

We recommend installing PyTorch before installing $\textsf{DeepOnto}$, following the commands listed on the PyTorch webpage. Note that users can choose either the GPU (with CUDA) or the CPU version of PyTorch.

In case the most recent PyTorch version causes any incompatibility issues, the following command (with CUDA 11.6) is known to work:

pip install torch torchvision torchaudio --extra-index-url https://download.pytorch.org/whl/cu116

Basic usage of $\textsf{DeepOnto}$ does not rely on GPUs, but for efficient deep learning model training, please make sure torch.cuda.is_available() returns True.
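As a minimal check (plain PyTorch, nothing $\textsf{DeepOnto}$-specific):

```python
import torch

# Report the installed PyTorch version and CUDA visibility;
# GPU-based training requires the second line to print True.
print(torch.__version__)
print(torch.cuda.is_available())
```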

Install from PyPI

Other dependencies are specified in setup.cfg and requirements.txt and are installed automatically along with deeponto.

# requiring Python>=3.8
pip install deeponto

We have been informed that openprompt conflicts with several other packages in ways that are hard to resolve on macOS with the M1 chip, so we now set it as an optional dependency. However, it is a main dependency of the OntoLAMA code at deeponto.complete.ontolama. To use OntoLAMA, please install openprompt separately, or install $\textsf{DeepOnto}$ with the following command:

pip install deeponto[ontolama]

Install from Git Repository

To install the latest, possibly unreleased version of deeponto, you can install directly from the repository:

pip install git+https://github.com/KRR-Oxford/DeepOnto.git

Main Features

Figure: Illustration of $\textsf{DeepOnto}$'s architecture.

Ontology Processing

The base class of $\textsf{DeepOnto}$ is [Ontology][deeponto.onto.Ontology], which serves as the main entry point to the OWLAPI's features, such as accessing ontology entities, querying for ancestor/descendant (and parent/child) concepts, deleting entities, modifying axioms, and retrieving annotations. See quick usage at load an ontology. Along with these basic functionalities, several essential sub-modules are built to enhance the core module, including the following (a combined usage sketch follows the list):

  • Ontology Reasoning ([OntologyReasoner][deeponto.onto.OntologyReasoner]): Each instance of $\textsf{DeepOnto}$ has a reasoner as its attribute. It is used for conducting reasoning activities, such as obtaining inferred subsumers and subsumees, as well as checking entailment and consistency.

  • Ontology Pruning ([OntologyPruner][deeponto.onto.OntologyPruner]): This sub-module aims to incorporate pruning algorithms for extracting a sub-ontology from an input ontology. We currently implement the one proposed in [2], which introduces subsumption axioms between the asserted (atomic or complex) parents and children of the class targeted for removal.

  • Ontology Verbalisation ([OntologyVerbaliser][deeponto.onto.OntologyVerbaliser]): The recursive concept verbaliser proposed in [4] is implemented here, which can automatically transform a complex logical expression into a textual sentence based on entity names or labels available in the ontology. See verbalising ontology concepts.

  • Ontology Projection ([OntologyProjector][deeponto.onto.OntologyProjector]): The projection algorithm adopted in the OWL2Vec* ontology embeddings is implemented here; it transforms an ontology's TBox into a set of RDF triples. The relevant code is adapted from the mOWL library.

  • Ontology Normalisation ([OntologyNormaliser][deeponto.onto.OntologyNormaliser]): The implemented $\mathcal{EL}$ normalisation is also modified from the mOWL library, which is used to transform TBox axioms into normalised forms to support, e.g., geometric ontology embeddings.

  • Ontology Taxonomy ([OntologyTaxonomy][deeponto.onto.OntologyTaxonomy]): The taxonomy extracted from an ontology is a directed acyclic graph for the subsumption hierarchy, which is often used to support graph-based deep learning applications.
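A combined usage sketch for the core module and the sub-modules above. `Ontology`, `OntologyVerbaliser`, and `get_subsumption_axioms` appear verbatim elsewhere on this page; the `reasoner` attribute name is assumed from the description above:

```python
from deeponto.onto import Ontology, OntologyVerbaliser

# Load an ontology from a local file (a reasoner type such as "hermit"
# can be passed as a second argument).
onto = Ontology("path_to_ontology.owl")

# Query the asserted subsumption hierarchy and verbalise the first axiom.
subsumption_axioms = onto.get_subsumption_axioms(entity_type="Classes")
verbaliser = OntologyVerbaliser(onto)
v_sub, v_super = verbaliser.verbalise_class_subsumption_axiom(subsumption_axioms[0])

# Each Ontology instance carries a reasoner (attribute name assumed here).
print(type(onto.reasoner))
```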

Tools and Resources

Individual tools and resources are implemented on top of the core ontology processing module. Currently, $\textsf{DeepOnto}$ supports tools such as BERTMap [1] for ontology alignment, BERTSubs [3] for subsumption prediction, and OntoLAMA [4] for probing language models, along with the OAEI Bio-ML resources [2].

License

!!! license "License"

Copyright 2021-2023 Yuan He.
Copyright 2023 Yuan He, Jiaoyan Chen.
All rights reserved.

Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at <http://www.apache.org/licenses/LICENSE-2.0>

Unless required by applicable law or agreed to in writing, software
distributed under the License is distributed on an "AS IS" BASIS,
WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
See the License for the specific language governing permissions and
limitations under the License.

Citation

The preprint of our system paper for $\textsf{DeepOnto}$ is currently available on arXiv.

Yuan He, Jiaoyan Chen, Hang Dong, Ian Horrocks, Carlo Allocca, Taehun Kim, and Brahmananda Sapkota. DeepOnto: A Python Package for Ontology Engineering with Deep Learning. arXiv preprint arXiv:2307.03067 (2023).

Our paper has been accepted by the Semantic Web Journal.

!!! credit "Citation"

```
@article{he2023deeponto,
  title={DeepOnto: A Python Package for Ontology Engineering with Deep Learning},
  author={He, Yuan and Chen, Jiaoyan and Dong, Hang and Horrocks, Ian and Allocca, Carlo and Kim, Taehun and Sapkota, Brahmananda},
  journal={arXiv preprint arXiv:2307.03067},
  year={2023}
}
```

Relevant Publications

  • [1] Yuan He, Jiaoyan Chen, Denvar Antonyrajah and Ian Horrocks. BERTMap: A BERT-Based Ontology Alignment System. In Proceedings of the 36th AAAI Conference on Artificial Intelligence (AAAI-2022). /arxiv/ /aaai/
  • [2] Yuan He, Jiaoyan Chen, Hang Dong, Ernesto Jiménez-Ruiz, Ali Hadian and Ian Horrocks. Machine Learning-Friendly Biomedical Datasets for Equivalence and Subsumption Ontology Matching. The 21st International Semantic Web Conference (ISWC-2022, Best Resource Paper Candidate). /arxiv/ /iswc/
  • [3] Jiaoyan Chen, Yuan He, Yuxia Geng, Ernesto Jiménez-Ruiz, Hang Dong and Ian Horrocks. Contextual Semantic Embeddings for Ontology Subsumption Prediction. World Wide Web Journal (WWWJ-2023). /arxiv/ /wwwj/
  • [4] Yuan He, Jiaoyan Chen, Ernesto Jiménez-Ruiz, Hang Dong and Ian Horrocks. Language Model Analysis for Ontology Subsumption Inference. Findings of the Association for Computational Linguistics (ACL-2023). /arxiv/ /acl/
  • [5] Yuan He, Jiaoyan Chen, Hang Dong, and Ian Horrocks. Exploring Large Language Models for Ontology Alignment. ISWC 2023 Posters and Demos: 22nd International Semantic Web Conference. /arxiv/ /iswc/

Please report any bugs or queries by raising a GitHub issue or by emailing the maintainers (Yuan He or Jiaoyan Chen) at:

[email protected]

deeponto's Issues

BERTMapPipeline crashes on custom ontologies

Describe the bug
When attempting to use BERTMapPipeline on publicly available ontologies, it fails with the following error:

ValueError: evaluation strategy steps requires either non-zero --eval_steps or --logging_steps

Based on #18, I understand that this means deeponto failed to generate training data; however, both ontologies are fairly large and have rdfs:label annotations. Any advice is welcome.

To Reproduce
Steps to reproduce the behavior:

  1. Download the schema.org ontology (or its OWL variant) from https://schema.org/docs/developers.html
  2. Download the SPHN schema from https://www.biomedit.ch/rdf/sphn-schema/sphn
  3. Get the default configuration file based on the docs
  4. Run the example from the usage page:
```python
from deeponto.onto import Ontology
from deeponto.align.bertmap import BERTMapPipeline

config = BERTMapPipeline.load_bertmap_config('config.yaml')
schemaorg = Ontology('schemaorg.ttl')
sphn = Ontology('sphn_schema.ttl')
BERTMapPipeline(schemaorg, sphn, config)
```

Expected behavior
A mapping is generated

Desktop (please complete the following information):

  • OS: Ubuntu 22.04

Small typo in the documentation.

While exploring the documentation I just came across a small typo in the example code snippets. There is a quotation mark missing in the a line of code there (see the link below). Nothing concerning but I just wanted to let you know. It may cause some unexpected error for those who copy-paste the code :)

onto.get_subsumption_axioms(entity_type="Classes) --> onto.get_subsumption_axioms(entity_type="Classes")

https://krr-oxford.github.io/DeepOnto/verbaliser/#:~:text=(entity_type%3D-,%22Classes),-%23%20verbalise%20the%20first

Generating results for EditSim

Hi, I am not able to reproduce the exact H@1 and MRR for EditSim on the FMA-SNOMED task, as reported in Table 4 of https://arxiv.org/pdf/2205.03447.pdf.

This is the command used:

python om_eval.py --saved_path './om_results' --pred_path './onto_match_experiment2/edit_sim/global_match/src2tgt' --ref_anchor_path 'data/equiv_match/refs/snomed2fma.body/unsupervised/src2tgt.rank/for_eval' --hits_at 1

These are the generated numbers: H@1: .841 and MRR: .89
Reported numbers in the paper: H@1: .869 and MRR: .895

I am not sure why the numbers are not consistent.
Is there anything that needs to be modified in the code to get the reported numbers?

division by zero error in AnnotationThesaurus

Describe the bug
While running BERTMap I'm receiving an error "ZeroDivisionError: division by zero"

To Reproduce
Launch BERTMap with these input files:
- configuration: bertmap.yaml
- source ontology: ontology-network.ttl
- target ontology: music.owl

Expected behavior
Mapping search between the ontologies should work normally

Actual output

[Time: 00:18:47] - [PID: 172] - [Model: bertmap] 
Load the following configurations:
{
    "model": "bertmap",
    "output_path": "/content",
    "annotation_property_iris": [
        "http://www.w3.org/2000/01/rdf-schema#label",
        "http://www.geneontology.org/formats/oboInOwl#hasSynonym",
        "http://www.geneontology.org/formats/oboInOwl#hasExactSynonym",
        "http://www.w3.org/2004/02/skos/core#exactMatch",
        "http://www.ebi.ac.uk/efo/alternative_term",
        "http://www.orpha.net/ORDO/Orphanet_#symbol",
        "http://purl.org/sig/ont/fma/synonym",
        "http://www.w3.org/2004/02/skos/core#prefLabel",
        "http://www.w3.org/2004/02/skos/core#altLabel",
        "http://ncicb.nci.nih.gov/xml/owl/EVS/Thesaurus.owl#P108",
        "http://ncicb.nci.nih.gov/xml/owl/EVS/Thesaurus.owl#P90"
    ],
    "known_mappings": null,
    "auxiliary_ontos": [],
    "bert": {
        "pretrained_path": "bert-base-uncased",
        "max_length_for_input": 128,
        "num_epochs_for_training": 3.0,
        "batch_size_for_training": 16,
        "batch_size_for_prediction": 128,
        "resume_training": null
    },
    "global_matching": {
        "enabled": true,
        "num_raw_candidates": 200,
        "num_best_predictions": 10,
        "mapping_extension_threshold": 0.8,
        "mapping_filtered_threshold": 0.9
    }
}
[Time: 00:18:47] - [PID: 172] - [Model: bertmap] 
Save the configuration file at /content/bertmap/config.yaml.
[Time: 00:18:47] - [PID: 172] - [Model: bertmap] 
Construct new text semantics corpora and save at /content/bertmap/data/text-semantics.corpora.json.
---------------------------------------------------------------------------
ZeroDivisionError                         Traceback (most recent call last)
[<ipython-input-12-a888744a31b2>](https://localhost:8080/#) in <cell line: 1>()
----> 1 bertmap = BERTMapPipeline(src_onto, tgt_onto, config)

6 frames
[/usr/local/lib/python3.10/dist-packages/deeponto/align/bertmap/pipeline.py](https://localhost:8080/#) in __init__(self, src_onto, tgt_onto, config)
    119         # load or construct the corpora
    120         self.corpora_path = os.path.join(self.data_path, "text-semantics.corpora.json")
--> 121         self.corpora = self.load_text_semantics_corpora()
    122 
    123         # load or construct fine-tune data

[/usr/local/lib/python3.10/dist-packages/deeponto/align/bertmap/pipeline.py](https://localhost:8080/#) in load_text_semantics_corpora(self)
    251                 corpora.save(self.data_path)
    252 
--> 253             return self.load_or_construct(self.corpora_path, data_name, construct)
    254 
    255         self.logger.info(f"No training needed; skip the construction of {data_name}.")

[/usr/local/lib/python3.10/dist-packages/deeponto/align/bertmap/pipeline.py](https://localhost:8080/#) in load_or_construct(self, data_file, data_name, construct_func, *args, **kwargs)
    227         else:
    228             self.logger.info(f"Construct new {data_name} and save at {data_file}.")
--> 229             construct_func(*args, **kwargs)
    230         # load the data file that is supposed to be saved locally
    231         return FileUtils.load_file(data_file)

[/usr/local/lib/python3.10/dist-packages/deeponto/align/bertmap/pipeline.py](https://localhost:8080/#) in construct()
    241 
    242             def construct():
--> 243                 corpora = TextSemanticsCorpora(
    244                     src_onto=self.src_onto,
    245                     tgt_onto=self.tgt_onto,

[/usr/local/lib/python3.10/dist-packages/deeponto/align/bertmap/text_semantics.py](https://localhost:8080/#) in __init__(self, src_onto, tgt_onto, annotation_property_iris, class_mappings, auxiliary_ontos)
    517         # build intra-ontology corpora
    518         # negative sample ratios are by default
--> 519         self.intra_src_onto_corpus = IntraOntologyTextSemanticsCorpus(src_onto, annotation_property_iris)
    520         self.add_samples_from_sub_corpus(self.intra_src_onto_corpus)
    521         self.intra_tgt_onto_corpus = IntraOntologyTextSemanticsCorpus(tgt_onto, annotation_property_iris)

[/usr/local/lib/python3.10/dist-packages/deeponto/align/bertmap/text_semantics.py](https://localhost:8080/#) in __init__(self, onto, annotation_property_iris, soft_negative_ratio, hard_negative_ratio)
    310         self.onto = onto
    311         # $\textsf{BERTMap}$ does not apply synonym transitivity
--> 312         self.thesaurus = AnnotationThesaurus(onto, annotation_property_iris, apply_transitivity=False)
    313 
    314         self.synonyms = self.thesaurus.synonym_sampling()

[/usr/local/lib/python3.10/dist-packages/deeponto/align/bertmap/text_semantics.py](https://localhost:8080/#) in __init__(self, onto, annotation_property_iris, apply_transitivity)
     74         self.annotation_property_iris = iris
     75         total_number_of_annotations = sum([len(v) for v in self.annotation_index.values()])
---> 76         self.average_number_of_annotations_per_class = total_number_of_annotations / len(self.annotation_index)
     77 
     78         # synonym groups

ZeroDivisionError: division by zero

Following the stack trace, I see that the code uses the length of self.annotation_index as the denominator, but apparently this length is zero. This dictionary is built by Ontology::build_annotation_index() based on annotation_property_iris, which, as can be seen above, is correctly populated and not empty. So I suspect the bug is located somewhere in this function, but I wasn't able to pinpoint exactly where.

Desktop (please complete the following information):

  • ipynb notebook in Google Colab
  • Version 0.8.4

Tokenizer error "list index out of range" during mapping extension

Describe the bug
Under some circumstances during the mapping extension stage, the tokenizer throws the error IndexError: list index out of range.
The error originates at bert_classifier.py line 185.
This is the same error at the same location inside the tokenizer as in huggingface/tokenizers#993, which was caused by the data passed to the tokenizer.

To Reproduce
I have reproduced this error with these settings:

| Logs & stack trace | max_length_for_input | batch_size_for_training | Source ontology | Target ontology |
| --- | --- | --- | --- | --- |
| link | 256 | 16 | music-representation.owl | musicClasses.owl @ 2ebb641 |
| link | 128 | 8 | core.owl | musicClasses.owl @ ebc2d09 |

Expected behavior
The stage and the pipeline should complete successfully

Platform:

  • OS: python notebook on Google Colab
  • Python 3.10
  • Transformers 4.30.2
  • DeepOnto 0.8.3

installation: cannot import java related packages

Describe the bug
After following the installation instructions (via conda) for deeponto + pytorch + jpype, I cannot import the Ontology class. It seems that none of the org.* packages can be imported.
Setting and exporting JAVA_HOME manually does not seem to fix the issue.

It fails with the following error (full traceback below):

ModuleNotFoundError: No module named 'org.slf4j'

Any advice would be appreciated.

NOTICE: I am not familiar with combining Python and Java.

To Reproduce
Steps to reproduce the behavior:

  1. conda create -n deeponto python=3.10
  2. conda activate deeponto
  3. conda install pytorch torchvision torchaudio cpuonly -c pytorch -c conda-forge
  4. conda install -c conda-forge jpype1
  5. In Python, run from deeponto.onto import Ontology

Expected behavior
The Ontology class is imported successfully.

Screenshots
Traceback below:

ipython
Python 3.10.14 | packaged by conda-forge | (main, Mar 20 2024, 12:45:18) [GCC 12.3.0]
Type 'copyright', 'credits' or 'license' for more information
IPython 8.24.0 -- An enhanced Interactive Python. Type '?' for help.

In [1]: from deeponto.onto import Ontology
Please enter the maximum memory located to JVM [8g]:

INFO:deeponto:8g maximum memory allocated to JVM.
INFO:deeponto:JVM started successfully.
---------------------------------------------------------------------------
ModuleNotFoundError                       Traceback (most recent call last)
Cell In[1], line 1
----> 1 from deeponto.onto import Ontology

File ~/.local/share/micromamba/envs/deeponto/lib/python3.10/site-packages/deeponto/onto/__init__.py:14
      1 # Copyright 2021 Yuan He. All rights reserved.
      2
      3 # Licensed under the Apache License, Version 2.0 (the "License");
   (...)
     12 # See the License for the specific language governing permissions and
     13 # limitations under the License.
---> 14 from .ontology import Ontology, OntologyReasoner
     15 from .pruning import OntologyPruner
     16 from .verbalisation import OntologyVerbaliser, OntologySyntaxParser

File ~/.local/share/micromamba/envs/deeponto/lib/python3.10/site-packages/deeponto/onto/ontology.py:47
     45 from java.io import File  # type: ignore
     46 from java.lang import Runtime, System  # type: ignore
---> 47 from org.slf4j.impl import SimpleLogger  # type: ignore
     48 System.setProperty(SimpleLogger.DEFAULT_LOG_LEVEL_KEY, "warn")  # set slf4j default logging level to warning
     49 from org.semanticweb.owlapi.apibinding import OWLManager  # type: ignore

ModuleNotFoundError: No module named 'org.slf4j'

Desktop (please complete the following information):

  • OS: Ubuntu 22.04

Consistency checking

Dear all,

thank you for DeepOnto.

I was wondering whether there is example code for consistency checking, e.g.

```python
from deeponto.onto import Ontology
onto = Ontology("path_to_ontology.owl", "hermit")
assert onto.consistent()
```

BERTMap Stuck at Mapping Extension

Describe the bug
The BERTMap model got stuck at the mapping extension phase.

To Reproduce
Steps to reproduce the behavior:
Run BERTMap on SNOMED-FMA (Body) task.

Verbaliser throws KeyError

Bug Description
I'm trying to verbalise a class expression. The code I'm executing is as follows:

```python
from deeponto.onto import Ontology, OntologyVerbaliser, OntologySyntaxParser

onto = Ontology("ontology.owl")
verbaliser = OntologyVerbaliser(onto)
complex_concepts = list(onto.get_asserted_complex_classes())

v_concept = verbaliser.verbalise_class_expression(complex_concepts[0])
```

Where ontology.owl is a simple ontology in RDF/XML syntax that contains an atomic concept, a datatype property, and a complex concept. The whole ontology is provided under Additional Context.

I get the following error:

Traceback (most recent call last):
  File "/home/pg-xai2/sampling/examples/prova_deeponto.py", line 42, in <module>
    v_concept = verbaliser.verbalise_class_expression(complex_concepts[0])
  File "/home/pg-xai2/.conda/envs/ontolearn/lib/python3.9/site-packages/deeponto/onto/verbalisation.py", line 227, in verbalise_class_expression
    return self._verbalise_junction(parsed_class_expression)
  File "/home/pg-xai2/.conda/envs/ontolearn/lib/python3.9/site-packages/deeponto/onto/verbalisation.py", line 334, in _verbalise_junction
    other_children.append(self.verbalise_class_expression(child))
  File "/home/pg-xai2/.conda/envs/ontolearn/lib/python3.9/site-packages/deeponto/onto/verbalisation.py", line 214, in verbalise_class_expression
    return self._verbalise_iri(parsed_class_expression)
  File "/home/pg-xai2/.conda/envs/ontolearn/lib/python3.9/site-packages/deeponto/onto/verbalisation.py", line 254, in _verbalise_iri
    verbal = self.vocab[iri] if not self.keep_iri else iri_node.text
KeyError: 'http://dl-learner.org/mutagenesis#Compound'

This is the printed complex concept (maybe you can just try to manually construct this concept and test it out):

ObjectIntersectionOf(<http://dl-learner.org/mutagenesis#Compound> DataSomeValuesFrom(<http://dl-learner.org/mutagenesis#act> DatatypeRestriction(xsd:decimal facetRestriction(minInclusive "0.04"^^xsd:decimal))))

To Reproduce

Execute the code described above using the given ontology.

Additional context

OS: Linux

content of ontology.owl:

```xml
<?xml version="1.0"?>
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:xsd="http://www.w3.org/2001/XMLSchema#"
         xmlns:rdfs="http://www.w3.org/2000/01/rdf-schema#"
         xmlns:owl="http://www.w3.org/2002/07/owl#"
         xml:base="http://dl-learner.org/mutagenesis"
         xmlns="http://dl-learner.org/mutagenesis#">

<owl:Ontology rdf:about="http://dl-learner.org/mutagenesis"/>

<owl:DatatypeProperty rdf:about="#act">
  <rdfs:domain rdf:resource="#Compound"/>
  <rdfs:range rdf:resource="http://www.w3.org/2001/XMLSchema#double"/>
</owl:DatatypeProperty>

<owl:Class rdf:about="#Compound"/>

<owl:Class rdf:about="http://dl-learner.org/Pred_1">
  <rdfs:subClassOf rdf:resource="http://www.w3.org/2002/07/owl#Thing"/>
  <owl:equivalentClass>
    <owl:Class>
      <owl:intersectionOf rdf:parseType="Collection">
        <rdf:Description rdf:about="#Compound"/>
        <owl:Restriction>
          <owl:onProperty rdf:resource="#act"/>
          <owl:someValuesFrom>
            <rdfs:Datatype>
              <owl:onDatatype rdf:resource="http://www.w3.org/2001/XMLSchema#decimal"/>
              <owl:withRestrictions>
                <rdf:Description>
                  <rdf:first>
                    <rdf:Description>
                      <xsd:minInclusive rdf:datatype="http://www.w3.org/2001/XMLSchema#decimal">0.04</xsd:minInclusive>
                    </rdf:Description>
                  </rdf:first>
                  <rdf:rest rdf:resource="http://www.w3.org/1999/02/22-rdf-syntax-ns#nil"/>
                </rdf:Description>
              </owl:withRestrictions>
            </rdfs:Datatype>
          </owl:someValuesFrom>
        </owl:Restriction>
      </owl:intersectionOf>
    </owl:Class>
  </owl:equivalentClass>
</owl:Class>

</rdf:RDF>
```

I tried other ontologies as well, including Carcinogenesis and the whole Mutagenesis, which you can find here. Since they do not contain complex concepts, I tried to verbalise subclass axioms like the following:

```python
# get subsumption axioms from the ontology
subsumption_axioms = onto.get_subsumption_axioms(entity_type="Classes")

# verbalise the first subsumption axiom
v_sub, v_super = verbaliser.verbalise_class_subsumption_axiom(subsumption_axioms[0])
```

The same kind of error as mentioned earlier occurred.


Please fix the version of transformer library

I have tried to run BERTMap, but got the following error:

RuntimeError: Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cpu! (when checking argument for argument index in method wrapper__index_select)

In fact, it is a bug that was introduced in Transformers 4.12.3 and fixed in 4.13.0. In short, the output of the Tokenizer is a BatchEncoding, but the Trainer only transfers Union[torch.Tensor, Tuple, List, Dictionary] to the GPU.
(Please refer to this link for more details: When running the Trainer cell, it found two devices (cuda:0 and CPU).)

I think this bug was introduced in commit 086a25cae945d496765cbbb09b36f9780d676ac7. Please consider pinning the version of Transformers.
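A hedged example of such a pin (assuming the fix shipped in Transformers 4.13.0 suffices, a specifier like the following would avoid the affected releases):

pip install "transformers>=4.13.0"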

REST API with Dockerfile

In addition to a library, consider also providing a Dockerfile that uses FastAPI to serve web APIs. For instance, instead of having to import the library, I could deploy a Docker container and call the APIs, supplying all the necessary inputs.
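For illustration, a minimal sketch of what such a service could look like; the /verbalise endpoint and its payload are hypothetical and not part of $\textsf{DeepOnto}$:

```python
# Hypothetical FastAPI wrapper sketching the requested service; the /verbalise
# endpoint and its payload are illustrative, not part of DeepOnto.
from fastapi import FastAPI
from pydantic import BaseModel
from deeponto.onto import Ontology, OntologyVerbaliser

app = FastAPI()
onto = Ontology("path_to_ontology.owl")  # loaded once at start-up
verbaliser = OntologyVerbaliser(onto)

class AxiomQuery(BaseModel):
    index: int  # index into the ontology's subsumption axioms

@app.post("/verbalise")
def verbalise(query: AxiomQuery):
    axioms = onto.get_subsumption_axioms(entity_type="Classes")
    v_sub, v_super = verbaliser.verbalise_class_subsumption_axiom(axioms[query.index])
    return {"sub": str(v_sub), "super": str(v_super)}
```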

BERTMap: ValueError: numpy.dtype size changed, may indicate binary incompatibility. Expected 96 from C header, got 88 from PyObject

I've been trying to use BERTMap and I get this error:

C:\Users\annas\AppData\Local\Programs\Python\Python312\python.exe C:/Users/annas/PycharmProjects/eswc-work/eval-deeponto.py
Traceback (most recent call last):
  File "C:\Users\annas\PycharmProjects\eswc-work\eval-deeponto.py", line 1, in <module>
    from deeponto.onto import Ontology
  File "C:\Users\annas\AppData\Local\Programs\Python\Python312\Lib\site-packages\deeponto\onto\__init__.py", line 14, in <module>
    from .ontology import Ontology, OntologyReasoner
  File "C:\Users\annas\AppData\Local\Programs\Python\Python312\Lib\site-packages\deeponto\onto\ontology.py", line 27, in <module>
    from deeponto.utils import (
  File "C:\Users\annas\AppData\Local\Programs\Python\Python312\Lib\site-packages\deeponto\utils\__init__.py", line 17, in <module>
    from .text_utils import *
  File "C:\Users\annas\AppData\Local\Programs\Python\Python312\Lib\site-packages\deeponto\utils\text_utils.py", line 23, in <module>
    import spacy
  File "C:\Users\annas\AppData\Local\Programs\Python\Python312\Lib\site-packages\spacy\__init__.py", line 6, in <module>
    from .errors import setup_default_warnings
  File "C:\Users\annas\AppData\Local\Programs\Python\Python312\Lib\site-packages\spacy\errors.py", line 3, in <module>
    from .compat import Literal
  File "C:\Users\annas\AppData\Local\Programs\Python\Python312\Lib\site-packages\spacy\compat.py", line 39, in <module>
    from thinc.api import Optimizer  # noqa: F401
    ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "C:\Users\annas\AppData\Local\Programs\Python\Python312\Lib\site-packages\thinc\api.py", line 1, in <module>
    from .backends import (
  File "C:\Users\annas\AppData\Local\Programs\Python\Python312\Lib\site-packages\thinc\backends\__init__.py", line 17, in <module>
    from .cupy_ops import CupyOps
  File "C:\Users\annas\AppData\Local\Programs\Python\Python312\Lib\site-packages\thinc\backends\cupy_ops.py", line 16, in <module>
    from .numpy_ops import NumpyOps
  File "thinc\backends\numpy_ops.pyx", line 1, in init thinc.backends.numpy_ops
ValueError: numpy.dtype size changed, may indicate binary incompatibility. Expected 96 from C header, got 88 from PyObject

Process finished with exit code 1

My numpy version is already upgraded; what could it be?

Repeated error message regarding BERT's configuration while using BertMap with custom ontologies

This is Shriram; I emailed you recently about my interest in using DeepOnto. I am currently using two different autonomous-vehicle ontologies and am unable to run the BERTMap model due to "ValueError: evaluation strategy steps requires either non-zero --eval_steps or --logging_steps". I am unsure where this error is coming from.

/usr/local/lib/python3.10/dist-packages/transformers/training_args.py in __post_init__(self)
   1301             self.eval_steps = self.logging_steps
   1302         else:
-> 1303             raise ValueError(
   1304                 f"evaluation strategy {self.evaluation_strategy} requires either non-zero --eval_steps or"
   1305                 " --logging_steps"

ValueError: evaluation strategy steps requires either non-zero --eval_steps or --logging_steps

This is the entire error I am getting.
Could the number of instances in my ontology be a reason for this error? I have tried multiple value changes in my config YAML file, but none of them work. Kindly help me with the same.

Thanks in advance!

Disable INFO level message of ELK reasoner.

Is your feature request related to a problem? Please describe.
Using the ELK reasoner prints a lot of console messages.

Describe the solution you'd like
Disable INFO level messages.
