RIR: Calibration for Knowledge Graph Models

Abstract:
A knowledge graph consists of entities and the relationships between them. Knowledge graph embedding models are neural network models that learn an embedding for every entity and relationship in the graph. These embeddings are learned to solve a particular task such as link prediction. Calibration is a post-processing technique that aligns the confidence scores predicted by a model with the distribution of the actual data. For instance, if the ground truth consists of 50% positive triples, a perfectly calibrated model would, on average, predict that a triple is positive with probability 0.5. This alignment makes models more reliable, as it increases the confidence of accurate predictions and decreases it otherwise.
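
To make this concrete, below is a minimal sketch of one standard post-hoc calibration technique, Platt scaling, applied to the raw scores of a knowledge graph embedding model. This is an illustration only, not the method presented in the talk; the scores and labels are synthetic placeholders standing in for held-out validation triples.

import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Placeholder: raw, uncalibrated scores a KG embedding model might assign to triples.
raw_scores = rng.normal(loc=0.0, scale=2.0, size=1000)
# Placeholder ground-truth labels (1 = true triple), loosely correlated with the scores.
labels = (raw_scores + rng.normal(scale=2.0, size=1000) > 0).astype(int)

# Platt scaling: fit a one-dimensional logistic regression mapping raw scores to probabilities.
platt = LogisticRegression()
platt.fit(raw_scores.reshape(-1, 1), labels)

# Calibrated probabilities: for a well-calibrated model, among triples assigned
# probability p, roughly a fraction p should actually be positive.
calibrated = platt.predict_proba(raw_scores.reshape(-1, 1))[:, 1]
print(calibrated[:5])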

While calibration has become a widely accepted technique for neural networks, very little work applies it to knowledge graph models. This gap exists because knowledge graph models operate under the open-world assumption: missing data is not necessarily negative, so negative instances are not readily available in a knowledge graph. My work improves calibration techniques for knowledge graph models using different synthetic negative generation strategies, as well as by adding semantic meaning to the confidence scores predicted by the model.
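
As background, one common way to obtain synthetic negatives is to corrupt the head or tail entity of an observed positive triple and discard corruptions that reproduce a known triple. The sketch below illustrates this generic strategy; it is not necessarily one of the strategies studied in this work, and the entities and triples are illustrative placeholders.

import random

# Toy knowledge graph: a handful of entities and observed (positive) triples.
entities = ["alice", "bob", "carol", "rit", "rochester"]
positives = {("alice", "works_at", "rit"), ("rit", "located_in", "rochester")}

def corrupt(triple, known_positives, candidate_entities):
    """Return a synthetic negative by replacing the head or tail entity."""
    head, rel, tail = triple
    while True:
        replacement = random.choice(candidate_entities)
        # Corrupt either the head or the tail entity with equal probability.
        candidate = (replacement, rel, tail) if random.random() < 0.5 else (head, rel, replacement)
        # Under the open-world assumption the corruption is not guaranteed to be
        # false; unobserved corruptions are simply treated as synthetic negatives.
        if candidate not in known_positives:
            return candidate

negatives = [corrupt(t, positives, entities) for t in positives for _ in range(2)]
print(negatives)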

Bio:
I am an MS student in the Computer Science department working with Professor Carlos Rivero. I completed my Bachelor of Engineering in Information Science and Engineering in Bangalore, India, and have long been interested in research at the intersection of data and machine learning.

Faculty Advisor: Carlos R. Rivero


Contact
Jordan Gates
585-475-2994
Event Snapshot
When and Where
February 04, 2021
12:30 pm - 1:30 pm
Room/Location: Zoom
Who
Open to the Public

Interpreter Requested?
No

Topics
interdisciplinary studies
research