Provably expressive temporal graph networks

dc.contributor: Aalto-yliopisto (fi)
dc.contributor: Aalto University (en)
dc.contributor.author: Souza, Amauri H. (en_US)
dc.contributor.author: Mesquita, Diego (en_US)
dc.contributor.author: Kaski, Samuel (en_US)
dc.contributor.author: Garg, Vikas (en_US)
dc.contributor.department: Department of Computer Science (en)
dc.contributor.editor: Koyejo, S. (en_US)
dc.contributor.editor: Mohamed, S. (en_US)
dc.contributor.editor: Agarwal, A. (en_US)
dc.contributor.editor: Belgrave, D. (en_US)
dc.contributor.editor: Cho, K. (en_US)
dc.contributor.editor: Oh, A. (en_US)
dc.contributor.groupauthor: Probabilistic Machine Learning (en)
dc.contributor.groupauthor: Professorship Kaski Samuel (en)
dc.contributor.groupauthor: Computer Science Professors (en)
dc.contributor.groupauthor: Computer Science - Artificial Intelligence and Machine Learning (AIML) - Research area (en)
dc.contributor.groupauthor: Finnish Center for Artificial Intelligence, FCAI (en)
dc.contributor.groupauthor: Helsinki Institute for Information Technology (HIIT) (en)
dc.contributor.groupauthor: Professorship Garg Vikas (en)
dc.date.accessioned: 2023-06-05T04:41:48Z
dc.date.available: 2023-06-05T04:41:48Z
dc.date.issued: 2022 (en_US)
dc.description: openaire: EC/H2020/951847/EU//ELISE
dc.description.abstract: Temporal graph networks (TGNs) have gained prominence as models for embedding dynamic interactions, but little is known about their theoretical underpinnings. We establish fundamental results about the representational power and limits of the two main categories of TGNs: those that aggregate temporal walks (WA-TGNs), and those that augment local message passing with recurrent memory modules (MP-TGNs). Specifically, novel constructions reveal the inadequacy of MP-TGNs and WA-TGNs, proving that neither category subsumes the other. We extend the 1-WL (Weisfeiler-Leman) test to temporal graphs, and show that the most powerful MP-TGNs should use injective updates, as in this case they become as expressive as the temporal WL. Also, we show that sufficiently deep MP-TGNs cannot benefit from memory, and MP/WA-TGNs fail to compute graph properties such as girth. These theoretical insights lead us to PINT --- a novel architecture that leverages injective temporal message passing and relative positional features. Importantly, PINT is provably more expressive than both MP-TGNs and WA-TGNs. PINT significantly outperforms existing TGNs on several real-world benchmarks. (en)
dc.description.version: Peer reviewed (en)
dc.format.extent: 13
dc.identifier.citation: Souza, A H, Mesquita, D, Kaski, S & Garg, V 2022, Provably expressive temporal graph networks. in S Koyejo, S Mohamed, A Agarwal, D Belgrave, K Cho & A Oh (eds), Advances in Neural Information Processing Systems 35 (NeurIPS 2022). Advances in Neural Information Processing Systems, vol. 35, Morgan Kaufmann Publishers, Conference on Neural Information Processing Systems, New Orleans, Louisiana, United States, 28/11/2022. <https://proceedings.neurips.cc/paper_files/paper/2022/hash/d029c97ee0db162c60f2ebc9cb93387e-Abstract-Conference.html> (en)
dc.identifier.isbn: 978-1-7138-7108-8
dc.identifier.issn: 1049-5258
dc.identifier.other: PURE UUID: aa2f1e41-c6c0-4bd6-97c1-d0574c42bbde (en_US)
dc.identifier.other: PURE ITEMURL: https://research.aalto.fi/en/publications/aa2f1e41-c6c0-4bd6-97c1-d0574c42bbde (en_US)
dc.identifier.other: PURE LINK: https://proceedings.neurips.cc/paper_files/paper/2022/hash/d029c97ee0db162c60f2ebc9cb93387e-Abstract-Conference.html (en_US)
dc.identifier.uri: https://aaltodoc.aalto.fi/handle/123456789/121224
dc.identifier.urn: URN:NBN:fi:aalto-202306053606
dc.language.iso: en (en)
dc.relation: info:eu-repo/grantAgreement/EC/H2020/951847/EU//ELISE (en_US)
dc.relation.ispartof: Conference on Neural Information Processing Systems (en)
dc.relation.ispartofseries: Advances in Neural Information Processing Systems 35 (NeurIPS 2022) (en)
dc.relation.ispartofseries: Advances in Neural Information Processing Systems; Volume 35 (en)
dc.rights: openAccess (en)
dc.title: Provably expressive temporal graph networks (en)
dc.type: A4 Article in a conference publication (fi: A4 Artikkeli konferenssijulkaisussa)