Context-aware entity linking for knowledge graphs to support decision making

Grant: US12596933B2 · Kind: B2 · Granted: Apr 07, 2026

Assignee

NEC CORPORATION

Inventors

David Friede, Kiril Gashteovski

Abstract

A machine learning model includes a context transformer and a decision head. The context transformer is a neural network composed of self-attention layers, and the model makes a link prediction for a query embedding. Input embeddings are received at inputs of the context transformer. The input embeddings include: a query embedding set comprising a subject embedding, an object embedding, and a relation embedding, one of which is the query embedding; and knowledge graph embeddings. A first self-attention layer generates an attention score for each of the input embeddings. A final layer of the context transformer generates the link prediction for the query embedding together with an output associated with each of the input embeddings. The decision head combines the attention score and the output for each input embedding to determine a significance score for that embedding, indicating how much that input contributed to the prediction.
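The abstract's architecture can be sketched as follows. This is a minimal illustration, not the patented implementation: the embedding dimension, the single-layer transformer, the random weights, and the specific decision-head rule (multiplying each input's attention score from the query position by the norm of its final-layer output) are all assumptions chosen for brevity, since the abstract does not specify how the two signals are combined.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8  # embedding dimension (illustrative choice)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, Wq, Wk, Wv):
    """One self-attention layer: returns per-input outputs and the attention matrix."""
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    A = softmax(Q @ K.T / np.sqrt(K.shape[1]))  # attention scores, rows sum to 1
    return A @ V, A

# Input embeddings: the query set (subject, relation, object) plus
# knowledge-graph context embeddings, stacked into one input sequence.
subject, relation, obj = rng.normal(size=(3, d))
kg_context = rng.normal(size=(5, d))          # hypothetical KG embeddings
X = np.vstack([subject, relation, obj, kg_context])  # shape (8, d)

Wq, Wk, Wv = rng.normal(size=(3, d, d)) / np.sqrt(d)
outputs, attn = self_attention(X, Wq, Wk, Wv)

# The query embedding is one member of the query set; here we treat the
# object slot (index 2) as the embedding whose link is being predicted.
query_idx = 2

# Assumed decision head: combine each input's attention score (as seen from
# the query position) with the magnitude of its final-layer output to get a
# per-input significance score.
significance = attn[query_idx] * np.linalg.norm(outputs, axis=1)
```

In a faithful implementation the attention scores would come from the first self-attention layer of a multi-layer stack and the outputs from the final layer; collapsing both into one layer here keeps the example short while preserving the combine-two-signals idea.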

CPC Classifications

G06N 5/022; G06N 3/08

Filing Date

2021-08-03

Application No.

17392319

Claims

17