Construction of Hyper-Relational Knowledge Graphs Using Pre-Trained Large Language Models
Extracting hyper-relations is crucial for constructing comprehensive knowledge graphs, but few supervised methods are available for this task. To address this gap, we introduce a zero-shot prompt-based method using OpenAI's GPT-3.5 model for extracting hyper-relational knowledge from text. Compared with a baseline, our model achieves promising results, with a recall of 0.77. Although its precision is currently lower, a detailed analysis of the model outputs has uncovered potential pathways for future research in this area.
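A zero-shot prompt-based pipeline of this kind typically has two parts: a prompt that instructs the model to emit structured hyper-relational facts, and a parser that turns the model's reply into (subject, relation, object, qualifiers) tuples. The sketch below is illustrative only; the prompt wording, JSON schema, and helper names are assumptions, not the paper's actual implementation, and the API call itself is omitted.

```python
import json

def build_prompt(text: str) -> str:
    # Hypothetical zero-shot prompt; the paper's exact wording is not shown here.
    return (
        "Extract hyper-relational facts from the text below as a JSON list, "
        'one object per fact, with keys "subject", "relation", "object", and '
        '"qualifiers" (a list of [key, value] pairs).\n\n'
        "Text: " + text
    )

def parse_facts(model_output: str):
    """Parse a JSON reply into hyper-relational tuples.

    Each fact becomes (subject, relation, object, qualifiers), where
    qualifiers is a list of (key, value) pairs attached to the main triple.
    """
    facts = json.loads(model_output)
    return [
        (
            f["subject"],
            f["relation"],
            f["object"],
            [tuple(q) for q in f.get("qualifiers", [])],
        )
        for f in facts
    ]

# A reply the model *might* return for
# "Einstein received the Nobel Prize in Physics in 1921."
reply = json.dumps([
    {
        "subject": "Einstein",
        "relation": "award received",
        "object": "Nobel Prize in Physics",
        "qualifiers": [["point in time", "1921"]],
    }
])

print(parse_facts(reply))
```

The qualifier list is what distinguishes a hyper-relation from an ordinary triple: the main fact (Einstein, award received, Nobel Prize in Physics) is enriched with the key-value pair (point in time, 1921).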
Methods
• Adam
• Attention Dropout
• BPE
• Cosine Annealing
• Dense Connections
• Dropout
• Fixed Factorized Attention
• GELU
• GPT-3
• Layer Normalization
• Linear Layer
• Linear Warmup With Cosine Annealing
• Multi-Head Attention
• Residual Connection
• Scaled Dot-Product Attention
• Softmax
• Strided Attention
• Weight Decay