Heterogeneous solids exist widely in everyday life, such as concrete and various composite materials. Recently, pixelization of CAD models for these solids has become an interesting topic. The following example shows the pixelization of a unit cubic cell filled with random particles. The process simply converts a CAD file into a discrete format that can be consumed by deep-learning software such as TensorFlow.
Caption: (a) The unit cell (in red) with spherical particles (in green). (b) The profiles obtained by cutting through three different planes. The pixelization is realized by a Python script.
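The idea behind the pixelization can be sketched in a few lines. The following is an illustrative example, not the original script: it marks each voxel of a regular grid over the unit cube as particle (1) or matrix (0) depending on whether the voxel center falls inside any spherical particle.

```python
# Illustrative sketch (not the original script): pixelize a unit cube
# containing randomly placed spherical particles into a 0/1 voxel array.
import numpy as np

def voxelize(centers, radius, n=32):
    """Return an n x n x n array: 1 inside any particle, 0 in the matrix."""
    # Voxel-center coordinates on a regular grid over the unit cube.
    axis = (np.arange(n) + 0.5) / n
    x, y, z = np.meshgrid(axis, axis, axis, indexing="ij")
    grid = np.zeros((n, n, n), dtype=np.uint8)
    for cx, cy, cz in centers:
        inside = (x - cx)**2 + (y - cy)**2 + (z - cz)**2 <= radius**2
        grid[inside] = 1
    return grid

rng = np.random.default_rng(0)
cell = voxelize(rng.random((10, 3)), radius=0.1)
print(cell.shape, cell.mean())  # grid size and particle volume fraction
```

The resulting array can be fed directly to TensorFlow (or any other framework) as a 3D tensor; the grid resolution `n` controls the trade-off between geometric fidelity and memory.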
Polar dielectric elastomers (PDEs) are electrically insulating soft materials in which polar groups are chemically incorporated into the material microstructure. These materials are relatively new and show promising applications, especially in soft robotics and devices driven by low electric fields.
Caption: A schematic showing the microstructure of a polymer network with polar groups.
One prominent feature of PDEs is the dielectric permittivity enhancement produced by the polar groups. In one of my recent papers, a mathematical formulation is proposed to shed light on the physical origin of this effect. It is found that the effective dielectric constant is proportional to the fraction of polar groups per polymer chain in PDEs. Based on this finding, a concept called "dielectric imperfection" is proposed under the assumption of an inhomogeneous distribution of polar groups in PDEs. A thorough discussion is given in the paper.
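To illustrate the proportionality (this is a generic linear mixing-rule sketch, not the formulation derived in the paper; the permittivity values are made-up examples), consider a chain in which a fraction f of the repeat units are polar:

```python
# Illustrative sketch only, NOT the paper's formulation: a linear mixing
# rule showing how the effective permittivity grows with the fraction f
# of polar groups per chain. eps_0 and eps_p are assumed example values.
def effective_permittivity(f, eps_0=3.0, eps_p=30.0):
    """Linear rule of mixtures between nonpolar (eps_0) and polar (eps_p) units."""
    return (1 - f) * eps_0 + f * eps_p

for f in (0.0, 0.2, 0.5):
    print(f, effective_permittivity(f))
```

The enhancement term, f * (eps_p - eps_0), is linear in f, which is the qualitative behavior described above; the paper's formulation of course contains more physics than this toy rule.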
Physical laws in our four-dimensional world (time + 3D space) are usually abstracted mathematically in the form of partial differential equations. These equations are therefore broadly applied in various boundary value problems (BVPs) arising from natural phenomena or engineering practice, such as wave propagation, heat transfer and elasticity problems. To solve these BVPs, analytical methods (e.g. separation of variables, eigenfunction expansion, the Laplace transform) are limited to simple configurations. With the development of computing, numerical methods (e.g. the finite element, finite volume and finite difference methods) have shown increasingly powerful capability in solving these BVPs. A natural trend is the fusion of computing and engineering; thus, numerous engineering software companies (ANSYS, Dassault, Altair, MSC, etc.) emerged over the past several decades.
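As a minimal concrete example of the numerical methods mentioned above, the finite difference method reduces a BVP to a linear system. The sketch below solves u''(x) = -1 on (0, 1) with u(0) = u(1) = 0, whose exact solution is u(x) = x(1 - x)/2; for this quadratic solution the second-order scheme is exact up to round-off.

```python
# Finite-difference sketch of a 1D boundary value problem:
# u''(x) = -1 on (0, 1), u(0) = u(1) = 0, exact solution u = x(1 - x)/2.
import numpy as np

def solve_poisson_1d(n=50):
    """Solve u'' = -1 with homogeneous Dirichlet ends at n interior nodes."""
    h = 1.0 / (n + 1)
    # Tridiagonal second-difference approximation of u''.
    A = (np.diag(-2.0 * np.ones(n)) +
         np.diag(np.ones(n - 1), 1) +
         np.diag(np.ones(n - 1), -1)) / h**2
    b = -np.ones(n)                   # right-hand side f(x) = -1
    x = np.linspace(h, 1.0 - h, n)    # interior grid points
    return x, np.linalg.solve(A, b)

x, u = solve_poisson_1d()
print(np.max(np.abs(u - x * (1 - x) / 2)))  # error vs. exact solution
```

The same assemble-and-solve pattern, in higher dimensions and with more elaborate discretizations, is what the commercial packages listed above automate.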
In recent years, the keyword "data" has attracted wide attention due to highly developed internet and communication technologies. That is the main reason internet companies (Google, Amazon and many others) have been trying to mine the value of the data collected across the internet. One result is the revival of machine learning, more precisely "deep learning". One hot area is the design of various neural networks (such as LSTM, CNN, LSTM-CNN), which form the basis of deep learning techniques and are regarded as universal approximators. It seems that the "art of fitting" (data modeling) is gradually approaching the "art of reasoning" (mathematical modeling), provided the training dataset is large enough.
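The "universal approximator" idea can be demonstrated with a deliberately small example (a toy sketch, not a realistic pipeline, which would use TensorFlow or a similar framework): a one-hidden-layer network trained by plain gradient descent to fit sin(x).

```python
# Toy sketch of the universal-approximator idea: a 1 -> 32 (tanh) -> 1
# network fitted to sin(x) by hand-written gradient descent.
import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(-np.pi, np.pi, 200).reshape(-1, 1)
y = np.sin(x)

# Network parameters.
W1 = rng.normal(0, 0.5, (1, 32)); b1 = np.zeros(32)
W2 = rng.normal(0, 0.5, (32, 1)); b2 = np.zeros(1)

lr = 0.02
for step in range(5000):
    h = np.tanh(x @ W1 + b1)          # hidden activations
    pred = h @ W2 + b2                # network output
    err = pred - y
    loss = np.mean(err**2)
    # Backpropagation of the mean-squared-error loss.
    g_pred = 2 * err / len(x)
    gW2 = h.T @ g_pred; gb2 = g_pred.sum(0)
    g_h = (g_pred @ W2.T) * (1 - h**2)
    gW1 = x.T @ g_h; gb1 = g_h.sum(0)
    W2 -= lr * gW2; b2 -= lr * gb2
    W1 -= lr * gW1; b1 -= lr * gb1

print(f"final mean-squared error: {loss:.4f}")
```

Even this tiny network drives the fitting error far below the variance of the target, which is the essence of the approximation claim; what it cannot do, of course, is extrapolate the "reasoning" behind the data.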
Engineering data are also valuable and widely accessible. How can engineering problems expressed as data be investigated by deep learning? In general, there are two topics: 1) data <-> physical laws, 2) data <-> solutions of BVPs. There has been tremendous research on both. It remains questionable whether deep learning techniques can surpass mathematical reasoning and traditional numerical methods. The first question is how large the dataset should be. The second question is the portability of a trained neural network: once a neural network is trained for a specific BVP, it cannot be transferred to another one. Thus, deep learning techniques for the discovery or solution of BVPs remain at a rather limited level, while the well-documented framework TensorFlow seems considerably powerful (e.g. its graph computation scheme and symbolic operators), which opens doors for research in many disciplines.
An example of physics-informed deep learning is shown below.
Caption: (a) A schematic of a cantilever beam subject to a time-varying tip force. The displacement data are extracted at the 3 points marked by red crosses. (b) The geometry parameters, material parameters and the force function. (c) The prediction compared with the exact solution.
The source code and training dataset can be found here: script-> Beam and data-> Data
As a tutorial, this post presents a simple and extensible user material subroutine (UMAT) that accounts for large deformation. For simplicity, hypoelasticity is adopted as the material law. The robustness and correctness of the UMAT are demonstrated in two examples: the uniaxial extension test of a single element and the triaxial compression test of a single element.
Caption: (a) Uniaxial extension test of a single element (force vs. extension). (b) Triaxial compression test of a single element (volume strain vs. compression). Both examples show results consistent with those obtained by the built-in subroutine.
Detailed comments are added in the source code for explanation. The framework designed in the source code can be easily applied to other material laws, such as hyperelasticity, plasticity and viscoelasticity.
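The core task such a subroutine performs can be summarized in a short sketch. The following is an illustrative Python analogue (not the Fortran UMAT itself): at each increment it updates the stress as stress_new = stress_old + C : dstrain and returns the material tangent, here the isotropic elastic stiffness in Voigt notation. The modulus values are example numbers.

```python
# Python analogue (not the actual Fortran UMAT) of a hypoelastic stress
# update: stress_new = stress_old + C : dstrain, with C the isotropic
# elastic stiffness in Voigt notation. E and nu are example values.
import numpy as np

def elastic_stiffness(E=210e3, nu=0.3):
    """6x6 isotropic stiffness (Voigt order: 11, 22, 33, 12, 13, 23)."""
    lam = E * nu / ((1 + nu) * (1 - 2 * nu))   # Lame's first parameter
    mu = E / (2 * (1 + nu))                    # shear modulus
    C = np.zeros((6, 6))
    C[:3, :3] = lam
    C[np.arange(3), np.arange(3)] += 2 * mu
    C[3:, 3:] = np.diag([mu, mu, mu])          # engineering shear strains
    return C

def umat_update(stress, dstrain, C):
    """One hypoelastic increment; the consistent tangent is simply C."""
    return stress + C @ dstrain, C

C = elastic_stiffness()
stress = np.zeros(6)
dstrain = np.array([1e-3, 0, 0, 0, 0, 0])      # small uniaxial strain step
stress, ddsdde = umat_update(stress, dstrain, C)
print(stress[:3])
```

In the real subroutine the same update is performed on the co-rotational (objective) stress rate so that large rotations are handled correctly; swapping `umat_update` for another constitutive law is how the framework extends to hyperelasticity, plasticity or viscoelasticity.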
Computational multiscale multiphysics analysis of soft composites is, to me, an interesting topic, in which time-dependent material nonlinearity, large deformations, multiphysics coupling and multiscale information passing are all accounted for. Out of personal interest, I have been developing a finite element package. The project was started in Dec. 2018 and is still under development. The so-called nested finite element method (Nested FEM) is used in the calculation. The package can capture the effect of complex material heterogeneity across different spatial scales. Here, we use dielectric elastomer composites (DECs) as an example (see the following figure).
Caption: (a) The initial configuration of a bulk DEC brick. (b) The deformed configuration of the DEC brick subject to an electric field. (c) The heterogeneous structure of the DEC brick; the dielectric permittivities differ between the red regions (inclusions) and the green region (matrix). (d) The profile of a cross section. (e) For a single-phase material, Nested FEM (inset: the unit cell) achieves a solution consistent with the analytical one. (f) For a two-phase material, the dielectric permittivity of the inclusions is x# (# = 1, 2, 4, 6, 8) times larger than that of the matrix (inset: the unit cell). The limit point, i.e. the largest voltage load, decreases as the dielectric permittivity of the inclusions increases. Both Nested FEM and direct numerical simulation confirm the same result.
If you are interested in this project, you are welcome to join in developing the package.
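The micro-to-macro information passing at the heart of such multiscale schemes can be illustrated with a deliberately simplified 1D example (an assumed toy case, not the Nested FEM code): a micro problem on the unit cell returns an effective property, which the macro problem then consumes. For phases arranged in series along a bar axis, the micro solution reduces to the harmonic (Reuss) average.

```python
# Simplified 1D illustration (assumed example, not the Nested FEM code):
# a "micro" computation returns an effective modulus for a two-phase
# unit cell, and the "macro" computation uses it to deform the bar.
def effective_modulus(E_incl, E_matrix, vol_frac):
    """Harmonic (Reuss) average for phases in series along the bar axis."""
    return 1.0 / (vol_frac / E_incl + (1 - vol_frac) / E_matrix)

def macro_elongation(F, A, L, E_eff):
    """Elongation of the homogenized bar under axial force F."""
    return F * L / (E_eff * A)

# Micro step: stiff inclusions (modulus 10) in a compliant matrix (modulus 1).
E_eff = effective_modulus(E_incl=10.0, E_matrix=1.0, vol_frac=0.5)
print(E_eff)                                   # lies between the phase moduli
# Macro step: solve the homogenized structural problem.
print(macro_elongation(F=1.0, A=1.0, L=1.0, E_eff=E_eff))
```

In Nested FEM the micro step is itself a nonlinear finite element problem solved at every macroscopic integration point, but the flow of information, unit cell in, effective response out, is the same.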
This was the first project that I spent several months completing, in late 2015, when I was pursuing a PhD at the University of Southern California. To evaluate the material properties of architected metamaterials with different lattice structures, they were designed in CAD software and fabricated with a 3D printer.
This is one of my favorite research projects from my time at USC, and I spent a lot of time and effort on it. The following experimental picture shows a ferromagnetic elastomer octet-truss lattice deformed by a magnetic field.