This project demonstrates a framework for automatically populating OWL ontologies from unstructured natural language text using Large Language Models (LLMs). It builds on the ontology population methodology presented in the paper "Robot Situation and Task Awareness using Large Language Models and Ontologies" and is written entirely in Python.
The system enables intuitive and efficient human-robot interaction by transforming natural language scene descriptions into structured ontological knowledge. This knowledge is represented in OWL format, which enables further reasoning and inference that can then be used to enhance robotic planning.
The pipeline performs:
- Entity extraction
- Property (attribute) extraction
- Relation extraction
- Ontology instantiation via Owlready2
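The four stages above can be pictured as a simple data flow. The sketch below is illustrative only: the dataclasses and the `extract` helper are hypothetical names, and a toy rule-based matcher stands in for the actual LLM-driven extraction.

```python
from dataclasses import dataclass, field
from typing import Dict, List, Tuple

@dataclass
class ExtractedEntity:
    name: str                         # instance name, e.g. "cup_1"
    cls: str                          # ontology class, e.g. "Cup"
    properties: Dict[str, str] = field(default_factory=dict)  # attributes

@dataclass
class ExtractedRelation:
    subject: str                      # instance name of the subject
    predicate: str                    # object property, e.g. "isOn"
    obj: str                          # instance name of the object

def extract(text: str) -> Tuple[List[ExtractedEntity], List[ExtractedRelation]]:
    """Toy stand-in for the LLM-driven extraction stages."""
    entities, relations = [], []
    if "cup" in text and "table" in text:
        entities.append(ExtractedEntity("cup_1", "Cup", {"color": "red"}))
        entities.append(ExtractedEntity("table_1", "Table"))
        relations.append(ExtractedRelation("cup_1", "isOn", "table_1"))
    return entities, relations

entities, relations = extract("A red cup is on the table.")
```

In the real pipeline each stage is a separate LLM prompt; the structured results are then instantiated into the ontology via Owlready2.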
```
.
├── ontologies/
│   ├── modified_ontologies/
│   └── test_ontologies/
├── src/
│   ├── ontology_manager.py
│   └── ontopopulator.py
├── tests/
│   └── onto_instanciation/
├── Example_texts/
│   └── natural_text.txt
├── config.template.json
├── requirements.txt
├── setup.py
└── README.md
```
Make sure you have Python 3.8+ installed.

```bash
python -m venv venv
source venv/bin/activate  # On Windows: venv\Scripts\Activate.ps1
pip install -r requirements.txt
```
All parameters are stored in a `config.json` file. Here's a template:
```json
{
  "API_KEY": "your-openai-api-key",
  "ontology": "your-ontology-name.owl",
  "ontologies_directory_path": "path/to/ontologies/",
  "text": "path/to/text.txt",
  "SAVE_PATH": "path/to/save/instantiated-ontology/"
}
```
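Since the pipeline fails late if a key is absent, it can be worth validating the config up front. The helper below is a sketch, not part of the project: `load_config` and `REQUIRED_KEYS` are hypothetical names, and the key set mirrors the template above.

```python
import json

# Keys defined by the config template above.
REQUIRED_KEYS = {"API_KEY", "ontology", "ontologies_directory_path", "text", "SAVE_PATH"}

def load_config(path: str = "config.json") -> dict:
    """Load the config file and fail fast if any required key is missing."""
    with open(path) as f:
        config = json.load(f)
    missing = REQUIRED_KEYS - config.keys()
    if missing:
        raise KeyError(f"config is missing keys: {sorted(missing)}")
    return config
```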
```python
from src.ontopopulator import OntoPopulator
import json

# Load config
with open("config.json") as f:
    config = json.load(f)

# Instantiate and run
api_key = config.get("API_KEY")
op = OntoPopulator(config.get("ontologies_directory_path"), config.get("ontology"), api_key=api_key)
results = op.populate_from_text(text_path=config.get("text"))

# Access outputs
print(results)
onto = results.get("ontology")
```
Example code for executing the program can be found in `tests/onto_instanciation/`, and an example text for the population in `tests/onto_instanciation/Example_texts`.
Specify your license here (e.g., MIT, Apache 2.0). If you don’t have one, consider using Choose a License.
This project uses: