Recently, graph neural network (GNN) based models have shown promising results in simulating complex physical systems. However, training a dedicated graph network simulator can be costly, as most models are confined to fully supervised training: extensive data generated from traditional simulators is required to train the model. It has remained unexplored how transfer learning could be applied to improve model performance and training efficiency. In this work, we introduce a pretraining and transfer learning paradigm for graph network simulators.
First, we propose the Scalable Graph U-Net (SGUNet). By incorporating an innovative depth-first search (DFS) pooling, SGUNet is configurable to different mesh sizes and resolutions for different simulation tasks. To enable transfer learning between differently configured SGUNets, we propose a set of mapping functions to align the parameters between the pretrained model and the target model. An extra regularization term is also added to the loss to constrain the similarity between the pretrained weights and the target model weights, for better generalization performance. We then create a dataset for pretraining the simulators, comprising 20,000 physical simulations with 3D shapes randomly selected from the open-source ABC (A Big CAD) dataset. We demonstrate that with our proposed transfer learning methods, a model fine-tuned on a small portion of the training data can reach even better performance than one trained from scratch. On the 2D Deformable Plate benchmark, our pretrained model fine-tuned on 1/16 of the training data achieves an 11.05% improvement over the model trained from scratch.
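As a rough illustration of the weight-similarity constraint described above, the sketch below adds a squared-distance penalty between the target model's parameters and the pretrained parameters to the task loss. The function names and the coefficient `lam` are hypothetical, chosen here for illustration; the paper's exact formulation may differ.

```python
import numpy as np

def regularized_loss(task_loss, target_params, pretrained_params, lam=1e-3):
    """Task loss plus a penalty that keeps target weights close to
    the (mapped) pretrained weights, encouraging generalization.

    task_loss         -- scalar supervised loss on the target task
    target_params     -- list of parameter arrays of the target model
    pretrained_params -- list of aligned pretrained parameter arrays
    lam               -- hypothetical regularization strength
    """
    penalty = sum(
        np.sum((t - p) ** 2)
        for t, p in zip(target_params, pretrained_params)
    )
    return task_loss + lam * penalty
```

In practice the pretrained parameters would first be passed through the mapping functions so their shapes align with the target SGUNet configuration before this penalty is computed.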