Building-GNN: Exploring a co-design framework for generating controllable 3D building prototypes by graph and recurrent neural networks

dc.contributor: Aalto-yliopisto (fi)
dc.contributor: Aalto University (en)
dc.contributor.author: Zhong, Ximing (en_US)
dc.contributor.author: Koh, Immanuel (en_US)
dc.contributor.author: Fricker, Prof. Dr. Pia (en_US)
dc.contributor.department: Department of Architecture (en)
dc.contributor.editor: Dokonal, Wolfgang (en_US)
dc.contributor.editor: Hirschberg, Urs (en_US)
dc.contributor.editor: Wurzer, Gabriel (en_US)
dc.contributor.organization: Singapore University of Technology and Design (en_US)
dc.date.accessioned: 2023-10-11T09:35:12Z
dc.date.available: 2023-10-11T09:35:12Z
dc.date.issued: 2023 (en_US)
dc.description: Publisher Copyright: © 2023, Education and research in Computer Aided Architectural Design in Europe. All rights reserved.
dc.description.abstract: This paper discusses a novel deep learning (DL) framework named Building-GNN, which combines a graph neural network (GNN) and a recurrent neural network (RNN) to address the challenge of generating controllable 3D voxel building models. The aim is to enable architects and AI to jointly explore the shape and internal spatial planning of 3D building models, forming a co-design paradigm. Whereas the 3D results of previous DL methods, such as 3DGAN, are difficult to control in detail and rarely satisfy the constraints and preferences in architects' inputs, Building-GNN can reason about the complex constraint relationships between individual voxels. In Building-GNN, the GNN learns the graph-structured relationships between 3D voxels, and the RNN captures the complex interplay of constraints among them. The training set consists of 4,000 rule-based generated 3D voxel models labeled with different degrees of masking. The quality of the 3D results is evaluated with metrics such as IoU, FID, and constraint satisfaction. The results demonstrate that adding the RNN improves the accuracy of both shape prediction and voxel-relationship prediction. Building-GNN can perform multi-step reasoning to complete the layout planning of a 3D model in different scenarios, based on the architect's precise control and incomplete input. (en)
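The abstract describes coupling a GNN over the voxel graph with an RNN that performs multi-step reasoning about inter-voxel constraints. The paper itself gives the actual architecture; purely as a rough illustration of how such a GNN + RNN combination can be wired up, the following PyTorch sketch pairs a simple mean-aggregation message-passing layer with a GRU cell that refines per-voxel states over several steps. All class names, dimensions, and the aggregation scheme are assumptions for illustration, not the authors' implementation.

```python
# Minimal, hypothetical sketch (not the authors' Building-GNN): message passing
# over a voxel adjacency plus a GRU that iterates a few "reasoning" steps.
import torch
import torch.nn as nn

class VoxelGraphStep(nn.Module):
    """One round of mean-aggregation message passing over the voxel graph."""
    def __init__(self, dim):
        super().__init__()
        self.msg = nn.Linear(dim, dim)      # transform neighbour features
        self.upd = nn.Linear(2 * dim, dim)  # combine self + aggregated message

    def forward(self, h, adj):
        # h:   (num_voxels, dim) voxel features
        # adj: (num_voxels, num_voxels) 0/1 adjacency between neighbouring voxels
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        m = adj @ self.msg(h) / deg                       # mean over neighbours
        return torch.relu(self.upd(torch.cat([h, m], dim=-1)))

class BuildingGNNSketch(nn.Module):
    """GNN message passing + GRU-based iterative refinement of voxel labels."""
    def __init__(self, in_dim, hid_dim, num_classes, steps=4):
        super().__init__()
        self.encode = nn.Linear(in_dim, hid_dim)
        self.gnn = VoxelGraphStep(hid_dim)
        self.gru = nn.GRUCell(hid_dim, hid_dim)  # carries state across steps
        self.readout = nn.Linear(hid_dim, num_classes)
        self.steps = steps

    def forward(self, x, adj):
        # x: (num_voxels, in_dim) partial / masked voxel input from the designer
        h = torch.relu(self.encode(x))
        state = torch.zeros_like(h)
        for _ in range(self.steps):               # multi-step reasoning
            msg = self.gnn(h, adj)
            state = self.gru(msg, state)
            h = state
        return self.readout(h)                    # per-voxel label logits

# Toy usage: 64 voxels, 8 input features, 5 possible voxel labels.
x = torch.randn(64, 8)
adj = (torch.rand(64, 64) > 0.9).float()
adj = ((adj + adj.T) > 0).float()                 # symmetric adjacency
logits = BuildingGNNSketch(8, 32, 5)(x, adj)
print(logits.shape)  # torch.Size([64, 5])
```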
dc.description.version: Peer reviewed (en)
dc.format.extent: 10
dc.format.extent: 431-440
dc.format.mimetype: application/pdf (en_US)
dc.identifier.citation: Zhong, X, Koh, I & Fricker, P D P 2023, Building-GNN: Exploring a co-design framework for generating controllable 3D building prototypes by graph and recurrent neural networks. in W Dokonal, U Hirschberg & G Wurzer (eds), Digital Design Reconsidered: Proceedings of the 41st Conference on Education and Research in Computer Aided Architectural Design in Europe (eCAADe 2023). vol. 2, eCAADe proceedings, eCAADe, Brussels, pp. 431-440, International Conference on Education and Research in Computer Aided Architectural Design in Europe, Graz, Austria, 20/09/2023. https://doi.org/10.52842/conf.ecaade.2023.2.431 (en)
dc.identifier.doi: 10.52842/conf.ecaade.2023.2.431 (en_US)
dc.identifier.isbn: 9789491207358
dc.identifier.issn: 2684-1843
dc.identifier.other: PURE UUID: 72dfbd04-a3fb-4cc0-9253-ed5366ba20e1 (en_US)
dc.identifier.other: PURE ITEMURL: https://research.aalto.fi/en/publications/72dfbd04-a3fb-4cc0-9253-ed5366ba20e1 (en_US)
dc.identifier.other: PURE LINK: http://www.scopus.com/inward/record.url?scp=85172474433&partnerID=8YFLogxK (en_US)
dc.identifier.other: PURE FILEURL: https://research.aalto.fi/files/124119094/Building_GNN_Zhong_Koh_Fricker_2023_pdfa1b.pdf (en_US)
dc.identifier.uri: https://aaltodoc.aalto.fi/handle/123456789/123909
dc.identifier.urn: URN:NBN:fi:aalto-202310116256
dc.language.iso: en (en)
dc.publisher: eCAADe
dc.relation.ispartof: International Conference on Education and Research in Computer Aided Architectural Design in Europe (en)
dc.relation.ispartofseries: Digital Design Reconsidered (en)
dc.relation.ispartofseries: Volume 2 (en)
dc.relation.ispartofseries: eCAADe proceedings (en)
dc.rights: openAccess (en)
dc.title: Building-GNN: Exploring a co-design framework for generating controllable 3D building prototypes by graph and recurrent neural networks (en)
dc.type: Conference article in proceedings (fi)
dc.type.version: publishedVersion