Loss convergence
E4H flavour
LMS, École Polytechnique, Paris, France
September 10, 2025
[video embed: width=80%, loop, autoplay]

Default box title
mybox looks like

Second box version
\[ \definecolor{Violet}{RGB}{162, 32, 185} \definecolor{Teal}{RGB}{0, 103, 127} \definecolor{Blue}{RGB}{58, 70, 245} \definecolor{Green}{RGB}{0,103,127} \definecolor{LGreen}{RGB}{62,128,102} \definecolor{red}{RGB}{206,0,55} \]
\[ \boldsymbol{u} = \mathop{\mathrm{arg\,min}}_{H^1\left(\Omega \right)} \int_{\Omega} \Psi\left( \boldsymbol{E}\left(\boldsymbol{u}\left(\textcolor{Blue}{\boldsymbol{x}}, \textcolor{LGreen}{\left\{\mu_i\right\}_{i \in \mathopen{~[\!\![~}1, \beta \mathclose{~]\!\!]}}}\right)\right) \right) ~\mathrm{d}\Omega - W_{\text{ext}}\left(\textcolor{Blue}{\boldsymbol{x}}, \textcolor{LGreen}{\left\{\mu_i\right\}_{i \in \mathopen{~[\!\![~}1, \beta \mathclose{~]\!\!]}}}\right) \]
Works just as well in boxes
\(\Psi = \frac{\lambda}{2} \text{tr}\left(\boldsymbol{E}\right)^2 + \mu \boldsymbol{E}:\boldsymbol{E}\)
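As a rough sketch (the values of \(\lambda\) and \(\mu\) below are arbitrary placeholders, not taken from the slides), the Saint-Venant-Kirchhoff energy density above can be evaluated directly with NumPy:

```python
import numpy as np

def strain_energy_density(E, lam=1.0, mu=0.5):
    """Psi = lam/2 * tr(E)^2 + mu * E:E  (E:E is the double contraction)."""
    return 0.5 * lam * np.trace(E) ** 2 + mu * np.tensordot(E, E)

# Toy symmetric Green-Lagrange strain tensor
E = np.array([[0.01, 0.002],
              [0.002, -0.005]])
psi = strain_energy_density(E)
```

`np.tensordot(E, E)` with its default `axes=2` contracts both indices, which is exactly the \(\boldsymbol{E}:\boldsymbol{E}\) term.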
And you can
\[ \mathcal{U}_h = \left\{\boldsymbol{u}_h \; | \; \boldsymbol{u}_h \in \text{Span}\left( \left\{ N_i^{\Omega}\left(\boldsymbol{x} \right)\right\}_{i \in \mathopen{~[\!\![~}1,N\mathclose{~]\!\!]}} \right)^d \text{, } \boldsymbol{u}_h = \boldsymbol{u}_d \text{ on }\partial \Omega_d \right\} \]
.green-bullets environment

.red-bullets environment

\[ \boldsymbol{u} \left(x_{0,0,0} \right) = \sum\limits_{i = 0}^C \sum\limits_{j = 0}^{N_i} \sigma \left( b_{i,j} + \sum\limits_{k = 0}^{M_{i,j}} \omega_{i,j,k}~ x_{i,j,k} \right) \]
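As an illustration only (not the slides' implementation: the activation, layer width, and weights below are placeholder choices), a one-hidden-layer ansatz of this shape can be evaluated in a few lines of NumPy:

```python
import numpy as np

def sigma(z):
    # Placeholder activation; the slide leaves sigma unspecified
    return np.tanh(z)

def nn_ansatz(x, W, b):
    """Sum over hidden units j of sigma(b_j + sum_k W[j, k] * x[k])."""
    return sigma(b + W @ x).sum()

rng = np.random.default_rng(0)
W = rng.normal(size=(8, 2))   # weights omega_{j,k}: 8 hidden units, 2 inputs
b = rng.normal(size=8)        # biases b_j
x = np.array([0.3, -0.1])
u = nn_ansatz(x, W, b)
```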
Citations are drawn from the .bib file and appear in the bibliography list. (Chinesta et al., 2011; Ladeveze, 1985)

Some content
Use the .r-stack div to stack several elements on top of each other
Use .fragment to incrementally reveal elements
Use data-id="xxx" to name the fragments

\[ \textcolor{Violet}{\boldsymbol{u}}\left(\textcolor{Blue}{\boldsymbol{x}}, \textcolor{LGreen}{\left\{\mu_i\right\}_{i \in \mathopen{~[\!\![~}1, \beta \mathclose{~]\!\!]}}}\right) = \sum\limits_{i=1}^m \textcolor{Blue}{\overline{\boldsymbol{u}}_i(\boldsymbol{x})} ~\textcolor{LGreen}{\prod_{j=1}^{\beta}\lambda_i^j(\mu^j)}\]
Use .fragment and fragment-index to incrementally replace the equations

\[ \boldsymbol{u}\left(\boldsymbol{x}, \left\{\mu_i\right\}_{i \in \mathopen{~[\!\![~}1, \beta \mathclose{~]\!\!]}}\right) = \overline{\boldsymbol{u}}(\boldsymbol{x}) ~\prod_{j=1}^{\beta}\lambda^j(\mu^j) \]
\[ \boldsymbol{u}\left(\boldsymbol{x}, \left\{\mu_i\right\}_{i \in \mathopen{~[\!\![~}1, \beta \mathclose{~]\!\!]}}\right) = \textcolor{red}{\sum\limits_{i=1}^{2}} \overline{\boldsymbol{u}}_{\textcolor{red}{i}}(\boldsymbol{x}) ~\prod_{j=1}^{\beta}\lambda_{\textcolor{red}{i}}^j(\mu^j) \]
\[ \boldsymbol{u}\left(\boldsymbol{x}, \left\{\mu_i\right\}_{i \in \mathopen{~[\!\![~}1, \beta \mathclose{~]\!\!]}}\right) = \sum\limits_{i=1}^{\textcolor{red}{m}} \overline{\boldsymbol{u}}_i(\boldsymbol{x}) ~\prod_{j=1}^{\beta}\lambda_i^j(\mu^j) \]
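To make the separated form concrete, here is a minimal NumPy sketch (single parameter, \(\beta = 1\); the two modes are chosen by hand, not computed by a PGD enrichment) showing that a rank-2 sum of space-parameter products exactly reproduces a field built from two separable terms:

```python
import numpy as np

def separated_sum(ubar, lam):
    """u[x, mu] = sum_i ubar[i, x] * lam[i, mu] for discretized modes.

    ubar: (m, Nx) space modes, lam: (m, Nmu) parameter modes."""
    return np.einsum('ix,im->xm', ubar, lam)

x = np.linspace(0, 1, 50)
mu = np.linspace(1, 2, 20)

# Field that is exactly separable with two modes
u_exact = np.outer(np.sin(np.pi * x), mu) + np.outer(x**2, mu**2)

ubar = np.stack([np.sin(np.pi * x), x**2])   # space modes ubar_i(x)
lam = np.stack([mu, mu**2])                  # parameter modes lambda_i(mu)
err = np.abs(separated_sum(ubar, lam) - u_exact).max()
```

With more parameters, each term carries one 1D mode per parameter, which is what makes the representation cheap to store and evaluate.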
Incremental reveal
Loss convergence
Loss decay
Using plotly
Generating .csv files from the results and loading them to create the plots

Use an iframe to embed 3D HTML