Sun Research Group at Columbia University
  • Home
  • PI
  • Team Members
  • Publications
  • Research
  • Teaching
  • Software & Data
  • Presentations
  • Computational Mechanics with AI

PhD student Bahador Bahmani passed the qualification exam and won the Dongju Lee '03 Memorial Award

5/26/2023


 
Our PhD student Bahador Bahmani has passed the qualification exam at Columbia University. His proposed dissertation, "Manifold-embedding data-driven mechanics," focuses on applying manifold learning to enable robust model-free simulations.
He has published 6 journal articles, including two papers in CMAME and one in JMPS on the topic of data-driven simulations. 

  1. B. Bahmani, W.C. Sun, Distance-preserving manifold de-noising for data-driven mechanics, Computer Methods in Applied Mechanics and Engineering, doi:10.1016/j.cma.2022.115857, 2022. [URL]
  2. B. Bahmani, W.C. Sun, Manifold embedding data-driven mechanics,  Journal of the Mechanics and Physics of Solids, doi:10.1016/j.jmps.2022.104927, 2022. [PDF]
  3. B. Bahmani, W.C. Sun, A kd-tree accelerated hybrid data-driven/model-based approach for poroelasticity problems with multi-fidelity multi-physics data, Computer Methods in Applied Mechanics and Engineering, 2021. [Video] [PDF]
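The model-free, distance-minimizing idea behind data-driven mechanics can be illustrated with a toy sketch (this is not code from the papers above; the function name and data are hypothetical): given a data set of strain-stress pairs, the solver searches for the data point closest to the current mechanical state in an energy-like metric defined by a reference stiffness C.

```python
import numpy as np

def nearest_material_state(eps, sig, data_eps, data_sig, C):
    """Project the current (strain, stress) state onto the material data
    set by minimizing an energy-like distance defined by a reference
    stiffness C (model-free data-driven paradigm; toy implementation)."""
    C_inv = np.linalg.inv(C)
    d = (np.einsum("ni,ij,nj->n", data_eps - eps, C, data_eps - eps)
         + np.einsum("ni,ij,nj->n", data_sig - sig, C_inv, data_sig - sig))
    k = int(np.argmin(d))
    return k, data_eps[k], data_sig[k]

# synthetic linear-elastic data in a 2-component state space
rng = np.random.default_rng(0)
C = np.array([[2.0, 0.5], [0.5, 1.0]])
data_eps = rng.normal(size=(100, 2))
data_sig = data_eps @ C
k, e_star, s_star = nearest_material_state(
    np.zeros(2), np.zeros(2), data_eps, data_sig, C)
```

In the manifold-embedding variant, the search is carried out in a learned low-dimensional embedding of the data rather than in the raw state space, which is what makes the approach robust to noisy or sparse data.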
  
In addition, he has collaborated on our NSF and MURI projects and published papers on causal discovery for granular physics, equivariant geometric learning for digital rock physics, and generative adversarial networks for generating microstructures (see below).

  1. X. Sun, B. Bahmani, N. Vlassis, W.C. Sun, Y. Xu, Data-driven discovery of interpretable causal relations for deep learning material laws with uncertainty propagation, Granular Matter, doi:10.1007/s10035-021-01137-y, 2021.
  2. C. Cai, N. Vlassis, L. Magee, R. Ma, Z. Xiong, B. Bahmani, T.-F. Wong, Y. Wang, W.C. Sun, Equivariant geometric learning for digital rock physics. Part I: Estimating formation factor and effective permeability tensors, International Journal for Multiscale Computational Engineering, doi:10.1615/IntJMultCompEng.2022042266, 2023. [arXiv]
  3. P.C.H. Nguyen, B. Bahmani, N.N. Vlassis, W.C. Sun, H.S. Udaykumar, S.S. Baek, Synthesizing controlled microstructures of porous media using generative adversarial networks and reinforcement learning, Scientific Reports, 12:9034, 2022. [URL]

The Dongju Lee Memorial Award is given in recognition of Bahador's achievement and in honor of his integrity, curiosity, and creativity. The award and the accompanying Memorial Lecture were established with a generous contribution from the Lee Family.

Congratulations, Bahador, on your outstanding achievement!

Our paper with Prof. Vlassis on a denoising-diffusion generative model for microstructure design with fine-tuned nonlinear properties has been accepted by CMAME.

5/12/2023


 
The paper preprint is available at [URL]. In this paper, we (first author: Nick Vlassis) introduce a denoising diffusion algorithm to discover microstructures with fine-tuned nonlinear properties. Denoising diffusion probabilistic models are generative models that use diffusion-based dynamics to gradually denoise images and generate realistic synthetic samples. By learning the reverse of a Markov diffusion process, we design an artificial intelligence that efficiently manipulates the topology of microstructures to generate a massive number of prototypes whose constitutive responses are sufficiently close to designated nonlinear constitutive responses.
While the unconditional diffusion process described above can readily generate microstructures consistent with the training data set, our goal is to design microstructures that exhibit prescribed mechanical behaviors. To achieve this, we use a conditional diffusion process that fine-tunes the resultant microstructures via feature vectors.
To identify the subset of microstructures with sufficiently precise fine-tuned properties, a convolutional neural network surrogate is trained to replace high-fidelity finite element simulations and filter out prototypes outside the admissible range. Results of this study indicate that the denoising diffusion process is capable of creating microstructures with fine-tuned nonlinear material properties within the latent space of the training data. More importantly, this denoising diffusion algorithm can easily be extended to incorporate additional topological and geometric modifications by introducing high-dimensional structures embedded in the latent space. Numerical experiments are conducted with the open-source Mechanical MNIST data set created by Prof. Lejeune's research group. Consequently, this algorithm is not only capable of performing inverse design of nonlinear effective media, but also learns the nonlinear structure-property map, enabling a quantitative understanding of the multi-scale interplay among geometry, topology, and effective macroscopic properties.
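For readers unfamiliar with denoising diffusion, a single reverse (denoising) step can be sketched as follows. This is a generic DDPM update with a placeholder noise prediction, not the trained conditional model from the paper; the function name and schedule are illustrative.

```python
import numpy as np

def ddpm_reverse_step(x_t, t, eps_pred, betas, rng):
    """One reverse (denoising) step of a DDPM: given the noisy sample x_t
    and the network's noise prediction eps_pred, sample x_{t-1}."""
    alphas = 1.0 - betas
    alpha_bar = np.cumprod(alphas)
    coef = betas[t] / np.sqrt(1.0 - alpha_bar[t])
    mean = (x_t - coef * eps_pred) / np.sqrt(alphas[t])
    if t > 0:  # no noise is added at the final step
        return mean + np.sqrt(betas[t]) * rng.normal(size=x_t.shape)
    return mean

betas = np.linspace(1e-4, 0.02, 1000)   # linear noise schedule
rng = np.random.default_rng(1)
x = rng.normal(size=(8, 8))             # stand-in "microstructure" image
x_prev = ddpm_reverse_step(x, 999, np.zeros_like(x), betas, rng)
```

In the conditional setting used in the paper, the noise-prediction network additionally receives a feature vector encoding the target constitutive response, which steers the denoising trajectory toward admissible designs.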

Collaborative work with Sandia National Lab on design/planning of experiments with Kalman filter + deep reinforcement learning has been published in Computational Mechanics

5/12/2023


 
Our collaborative work with Sandia, funded by the LDRD project led by Dr. Sharlotte Kramer, has just been published in the special issue "Machine Learning Theories, Modeling, and Applications to Computational Materials Science, Additive Manufacturing, Mechanics of Materials, Design and Optimization". The article is available on the Computational Mechanics website [URL].

Experimental data are often costly to obtain, which makes it difficult to calibrate complex models. For many models, an experimental design that produces the best calibration given a limited experimental budget is not obvious. This paper introduces a deep reinforcement learning (RL) algorithm for design of experiments that maximizes the information gain, measured by the Kullback-Leibler divergence obtained via the Kalman filter (KF).
This combination enables experimental design for rapid online experiments where manual trial-and-error is not feasible in the high-dimensional parametric design space. We formulate possible configurations of experiments as a decision tree and a Markov decision process, where a finite choice of actions is available at each incremental step. 
Once an action is taken, a variety of measurements are used to update the state of the experiment. This new data leads to a Bayesian update of the parameters by the KF, which is used to enhance the state representation. In contrast to the Nash-Sutcliffe efficiency index, which requires additional sampling to test hypotheses for forward predictions, the KF can lower the cost of experiments by directly estimating the values of new data acquired through additional actions. In this work, our applications focus on mechanical testing of materials. Numerical experiments with complex, history-dependent models are used to verify the implementation and benchmark the performance of the RL-designed experiments.
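The KF-plus-information-gain machinery can be sketched with a generic linear Kalman update and the Gaussian KL divergence (this is not the code used in the paper; all names are hypothetical). An RL agent would score each candidate action by the KL gain its predicted measurement produces.

```python
import numpy as np

def kalman_update(m, P, H, R, y):
    """Linear Kalman measurement update for parameters theta ~ N(m, P)."""
    S = H @ P @ H.T + R                   # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)        # Kalman gain
    m_new = m + K @ (y - H @ m)
    P_new = (np.eye(len(m)) - K @ H) @ P
    return m_new, P_new

def kl_gain(m0, P0, m1, P1):
    """KL(posterior || prior) between Gaussians: the information gained
    about the parameters by incorporating the measurement."""
    k = len(m0)
    iP0 = np.linalg.inv(P0)
    dm = m1 - m0
    return 0.5 * (np.trace(iP0 @ P1) + dm @ iP0 @ dm - k
                  + np.log(np.linalg.det(P0) / np.linalg.det(P1)))

# one hypothetical measurement of the first of two parameters
m0, P0 = np.zeros(2), np.eye(2)
H, R = np.array([[1.0, 0.0]]), np.array([[0.1]])
m1, P1 = kalman_update(m0, P0, H, R, np.array([0.5]))
gain = kl_gain(m0, P0, m1, P1)
```

Because the update is closed-form, the expected gain of each candidate action in the decision tree can be evaluated cheaply, which is what makes online experimental design feasible.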

Prof. Sun won the Walter Huber Prize from ASCE

5/7/2023


 

I am deeply honored to receive the 2023 Walter Huber Civil Engineering Research Prize from ASCE. Many thanks to my current and former students and postdocs, who are the necessary condition for it to happen. #civilengineering #ASCE https://t.co/9hqu0FRCuo

— Steve Sun (@Poromechanics) March 31, 2023

New paper on PyTorch-to-FORTRAN UMAT implementation of level set plasticity accepted by Mechanics of Materials

5/6/2023


 
Our collaborative research with Sandia National Laboratories on the PyTorch-UMAT implementation of machine learning models has been accepted by Mechanics of Materials (see preprint [PDF]).

This paper introduces a publicly available PyTorch-ABAQUS deep-learning framework for a family of plasticity models in which the yield surface is implicitly represented by a scalar-valued function. Our goal is to introduce a practical framework that can be deployed for engineering analysis via a user-defined material subroutine (UMAT/VUMAT) for ABAQUS, which is written in FORTRAN.

To accomplish this task while leveraging the back-propagation learning algorithm to speed up the neural-network training, we introduce an interface code where the weights and biases of the trained neural networks obtained via the PyTorch library can be automatically converted into a generic FORTRAN code that can be a part of the UMAT/VUMAT algorithm. To enable third-party validation, we purposely make all the data sets, source code used to train the neural-network-based constitutive models, and the trained models available in a public repository. See the link below:

https://github.com/hyoungsuksuh/ABAQUS_NN
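The weight-conversion step can be illustrated with a minimal sketch (this is not the actual ABAQUS_NN interface code; the function and naming scheme are hypothetical): the trained weights and biases of each dense layer are flattened in column-major (Fortran) order and emitted as Fortran declarations plus data statements that a UMAT can evaluate.

```python
import numpy as np

def export_layer_to_fortran(name, W, b):
    """Emit a dense layer's trained weights and biases as Fortran
    declarations plus data statements (illustrative naming scheme)."""
    rows, cols = W.shape
    w_vals = ", ".join(f"{v:.8e}" for v in W.flatten(order="F"))  # column-major
    b_vals = ", ".join(f"{v:.8e}" for v in b)
    return "\n".join([
        f"      real*8 :: {name}_w({rows},{cols}), {name}_b({len(b)})",
        f"      data {name}_w / {w_vals} /",
        f"      data {name}_b / {b_vals} /",
    ])

# hypothetical layer: 2 inputs, 3 outputs (PyTorch stores the transpose;
# here W is assumed already arranged as (in, out))
W = np.arange(6, dtype=float).reshape(2, 3)
b = np.zeros(3)
code_str = export_layer_to_fortran("fc1", W, b)
```

The forward pass then reduces to a chain of matrix-vector products and activation calls inside the UMAT, so no Python runtime is needed at analysis time.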

A variety of neural-network architectures has been pre-trained and is available in the repository.

Benchmark material-point and finite element simulations in ABAQUS are provided in the repository. Please feel free to modify the code, and please cite this paper if you use it in your own research.

Note: we are actively developing this repository, which may contain bugs. If you encounter one, please let us know (Hyoung Suk Suh, [email protected]; WaiChing Sun, [email protected]). We hope this small tool encourages and helps more researchers in the ABAQUS ecosystem to build their own neural network models. Thank you!

Reference: 
  • H.S. Suh, C. Kweon, B. Lester, S. Kramer, W.C. Sun, A publicly available PyTorch-ABAQUS UMAT deep-learning framework for level-set plasticity, Mechanics of Materials, in press, 2023.



    Group News

    News about Computational Poromechanics lab at Columbia University.


Contact Information
Prof. Steve Sun
Phone: +1 212-851-4371
Fax: +1 212-854-6267
Email: [email protected]
Copyright © 2014-2025. All rights reserved.