An Information Entropy based Inference Acceleration Method for Graph Neural Networks
First published: 2024-04-03
Abstract: Due to the exponential growth of graph data and the large number of model parameters, Graph Neural Networks (GNNs) often face high computational costs, which significantly limit their practical application. Recent research has focused on applying the Lottery Ticket Hypothesis (LTH) to sparsify GNNs, aiming to reduce inference costs without compromising performance. This approach streamlines both the graph structure and the model parameters to enhance the computational efficiency and practicality of GNNs. However, LTH-based methods have two main drawbacks: 1) they require exhaustive iterative training of dense models, incurring substantial training costs; and 2) they apply unstructured pruning to model parameters, which currently fails to deliver satisfactory practical speedups. To overcome these limitations, this paper proposes an information entropy based pruning method for Graph Neural Networks, termed EBP. The method identifies and removes the edges and model-parameter channels with the least impact on final predictions, accelerating the model while ensuring that pruning does not significantly degrade performance. Experimental results show that the method achieves a good balance between model performance and computational efficiency, demonstrating that entropy-based pruning provides an effective means of accelerating GNNs and improving their feasibility and efficiency in practical applications.
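The abstract does not spell out EBP's exact scoring rule, but the general idea it describes, scoring graph edges and parameter channels by information entropy and keeping only the most informative ones, can be sketched as follows. This is a minimal NumPy sketch under the assumption that low-entropy edge features and channel activations carry the least information; the names `prune_edges`, `prune_channels`, and `keep_ratio` are illustrative, not taken from the paper.

```python
import numpy as np


def entropy(x, eps=1e-12):
    """Shannon entropy of |x| normalized into a probability distribution."""
    p = np.abs(x) / (np.abs(x).sum() + eps)
    return float(-(p * np.log(p + eps)).sum())


def prune_edges(edge_index, edge_feat, keep_ratio):
    """Keep the keep_ratio fraction of edges whose feature vectors have the
    highest entropy; low-entropy edges are assumed least informative."""
    scores = np.array([entropy(f) for f in edge_feat])
    k = max(1, int(round(len(scores) * keep_ratio)))
    keep = np.sort(np.argsort(-scores)[:k])
    return edge_index[:, keep]


def prune_channels(weight, activations, keep_ratio):
    """Structured pruning: drop output channels (columns of `weight`) whose
    activation distributions across samples have the lowest entropy."""
    scores = np.array([entropy(activations[:, c]) for c in range(weight.shape[1])])
    k = max(1, int(round(weight.shape[1] * keep_ratio)))
    keep = np.sort(np.argsort(-scores)[:k])
    return weight[:, keep], keep
```

Because whole columns are removed, the channel pruning is structured: the remaining weight matrix is dense and smaller, which is what yields practical speedups on ordinary hardware, in contrast to the unstructured (element-wise) sparsity the abstract criticizes in LTH-based methods.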
Keywords: artificial intelligence; graph neural network; graph sparsification; model pruning