
Please use this identifier to cite or link to this item: http://dspace.bits-pilani.ac.in:8080/jspui/handle/123456789/19227
Full metadata record
DC Field | Value | Language
dc.contributor.author | Sinha, Yash | -
dc.date.accessioned | 2025-08-25T10:12:55Z | -
dc.date.available | 2025-08-25T10:12:55Z | -
dc.date.issued | 2023-09 | -
dc.identifier.uri | https://arxiv.org/abs/2309.16173 | -
dc.identifier.uri | http://dspace.bits-pilani.ac.in:8080/jspui/handle/123456789/19227 | -
dc.description.abstract | Graph unlearning has emerged as a pivotal method to delete information from a pre-trained graph neural network (GNN). One may delete nodes, a class of nodes, edges, or a class of edges. An unlearning method enables a GNN model to comply with data protection regulations (i.e., the right to be forgotten), adapt to evolving data distributions, and reduce the GPU-hours carbon footprint by avoiding repetitive retraining. Existing partitioning- and aggregation-based methods have limitations due to their poor handling of local graph dependencies and their additional overhead costs. More recently, GNNDelete offered a model-agnostic approach that alleviates some of these issues. Our work takes a novel approach to these challenges in graph unlearning through knowledge distillation: it distills to delete in GNNs (D2DGN). It is a model-agnostic distillation framework in which the complete graph knowledge is divided and marked for retention and deletion. It performs distillation with response-based soft targets and feature-based node embeddings while minimizing KL divergence. The unlearned model effectively removes the influence of the deleted graph elements while preserving knowledge about the retained graph elements. D2DGN surpasses the performance of existing methods when evaluated on various real-world graph datasets by up to (AUC) in edge and node unlearning tasks. Other notable advantages include better efficiency, better performance in removing target elements, preservation of performance for the retained elements, and zero overhead costs. Notably, D2DGN surpasses the state-of-the-art GNNDelete in AUC by , improves the membership inference ratio by , requires fewer FLOPs per forward pass, and is up to faster. | en_US
dc.language.iso | en | en_US
dc.subject | Computer Science | en_US
dc.subject | Graph unlearning | en_US
dc.subject | Graph neural networks (GNNs) | en_US
dc.subject | Knowledge distillation | en_US
dc.subject | Data deletion | en_US
dc.subject | Node and edge removal | en_US
dc.title | Distill to delete: unlearning in graph networks with knowledge distillation | en_US
dc.type | Preprint | en_US
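
The abstract above describes a distillation objective built from response-based soft targets, feature-based node embeddings, and KL-divergence minimization over retained versus deleted graph elements. The following is a minimal, illustrative PyTorch sketch of such an objective under those assumptions; the function name, loss weights, and the uniform-target deletion term are hypothetical choices made here for illustration and are not taken from the paper's implementation.

    # Illustrative sketch only; names and the deletion term are assumptions, not the authors' code.
    import torch
    import torch.nn.functional as F

    def distill_unlearn_loss(student_logits_retain, teacher_logits_retain,
                             student_emb_retain, teacher_emb_retain,
                             student_logits_delete,
                             temperature=2.0, alpha=1.0, beta=1.0, gamma=1.0):
        """Combine response-based and feature-based distillation on the retained
        graph with a forgetting term on the deleted elements (illustrative)."""
        # Response-based distillation: KL divergence between softened teacher
        # and student predictions on retained nodes/edges.
        kl = F.kl_div(
            F.log_softmax(student_logits_retain / temperature, dim=-1),
            F.softmax(teacher_logits_retain / temperature, dim=-1),
            reduction="batchmean",
        ) * (temperature ** 2)

        # Feature-based distillation: match node embeddings on the retained graph.
        feat = F.mse_loss(student_emb_retain, teacher_emb_retain)

        # Deletion term (assumed form): push predictions for deleted elements
        # toward a uniform, uninformative distribution.
        uniform = torch.full_like(student_logits_delete,
                                  1.0 / student_logits_delete.size(-1))
        forget = F.kl_div(F.log_softmax(student_logits_delete, dim=-1),
                          uniform, reduction="batchmean")

        return alpha * kl + beta * feat + gamma * forget

    if __name__ == "__main__":
        # Toy shapes: 8 retained examples, 4 deleted examples, 3 classes, 16-dim embeddings.
        s_r, t_r = torch.randn(8, 3), torch.randn(8, 3)
        s_e, t_e = torch.randn(8, 16), torch.randn(8, 16)
        s_d = torch.randn(4, 3)
        print(distill_unlearn_loss(s_r, t_r, s_e, t_e, s_d).item())
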
Appears in Collections: Department of Computer Science and Information Systems

Files in This Item:
There are no files associated with this item.


Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.