
Commit 7b4e53e: "first commit"
1 parent 745372f

16 files changed (+834, -11506 lines)

_freeze/index/execute-results/docx.json

Lines changed: 2 additions & 2 deletions
@@ -1,8 +1,8 @@
{
-"hash": "ae66ddeddd15b4b93302fc30e2f9e751",
+"hash": "ac59b1676fc1eebad5b1e59e7f6e38e2",
"result": {
"engine": "jupyter",
-
"markdown": "---\ntitle: La Palma Earthquakes\nauthor:\n - name: Steve Purves\n orcid: 0000-0002-0760-5497\n corresponding: true\n email: steve@curvenote.com\n roles:\n - Investigation\n - Project administration\n - Software\n - Visualization\n affiliations:\n - Curvenote\n - name: Rowan Cockett\n orcid: 0000-0002-7859-8394\n corresponding: false\n roles: []\n affiliations:\n - Curvenote\nkeywords:\n - La Palma\n - Earthquakes\nabstract: |\n In September 2021, a significant jump in seismic activity on the island of La Palma (Canary Islands, Spain) signaled the start of a volcanic crisis that still continues at the time of writing. Earthquake data is continually collected and published by the Instituto Geográphico Nacional (IGN). ...\nplain-language-summary: |\n Earthquake data for the island of La Palma from the September 2021 eruption is found ...\nkey-points:\n - A web scraping script was developed to pull data from the Instituto Geogràphico Nacional into a machine-readable form for analysis\n - Earthquake events on La Palma are consistent with the presence of both mantle and crustal reservoirs.\ndate: last-modified\nbibliography: references.bib\ncitation:\n container-title: Earth and Space Science\nnumber-sections: true\njupyter: python3\n---\n\n\n\n\n\n\n## Introduction\n\n::: {.cell execution_count=1}\n``` {.python .cell-code .hidden}\nimport matplotlib.pyplot as plt\nimport numpy as np\neruptions = [1492, 1585, 1646, 1677, 1712, 1949, 1971, 2021]\n```\n:::\n\n\n::: {.cell execution_count=2}\n``` {.python .cell-code .hidden}\nplt.figure(figsize=(6, 1))\nplt.eventplot(eruptions, lineoffsets=0, linelengths=0.1, color='black')\nplt.gca().axes.get_yaxis().set_visible(False)\nplt.ylabel('')\nplt.show()\n```\n\n::: {.cell-output .cell-output-display}\n![Timeline of recent earthquakes on La Palma](index_files/figure-docx/fig-timeline-output-1.png){#fig-timeline fig-alt='An event plot of the years of the last 8 eruptions on La Palma.'}\n:::\n:::\n\n\n::: {.cell 
execution_count=3}\n``` {.python .cell-code .hidden}\navg_years_between_eruptions = np.mean(np.diff(eruptions[:-1]))\navg_years_between_eruptions\n```\n\n::: {.cell-output .cell-output-display .hidden execution_count=30}\n```\nnp.float64(79.83333333333333)\n```\n:::\n:::\n\n\n:::{#7b4f9623 .cell .markdown}\nBased on data up to and including 1971, eruptions on La Palma happen every 79\\.8 years on average.\n\nStudies of the magma systems feeding the volcano, such as @marrero2019, have proposed that there are two main magma reservoirs feeding the Cumbre Vieja volcano; one in the mantle (30-40km depth) which charges and in turn feeds a shallower crustal reservoir (10-20km depth).\n\nEight eruptions have been recorded since the late 1400s (@fig-timeline).\n\nData and methods are discussed in @sec-data-methods.\n\nLet $x$ denote the number of eruptions in a year. Then, $x$ can be modeled by a Poisson distribution\n\n$$\np(x) = \\frac{e^{-\\lambda} \\lambda^{x}}{x !}\n$$ {#eq-poisson}\n\nwhere $\\lambda$ is the rate of eruptions per year. 
Using @eq-poisson, the probability of an eruption in the next $t$ years can be calculated.\n\n| Name | Year |\n|---------------------|------|\n| Current | 2021 |\n| Teneguía | 1971 |\n| Nambroque | 1949 |\n| El Charco | 1712 |\n| Volcán San Antonio | 1677 |\n| Volcán San Martin | 1646 |\n| Tajuya near El Paso | 1585 |\n| Montaña Quemada | 1492 |\n\n: Recent historic eruptions on La Palma {#tbl-history}\n\n@tbl-history summarises the eruptions recorded since the colonization of the islands by Europeans in the late 1400s.\n\n![Map of La Palma](images/la-palma-map.png){#fig-map}\n\nLa Palma is one of the west most islands in the Volcanic Archipelago of the Canary Islands (@fig-map).\n\n\n\n\n\n\n{{< embed notebooks/data-screening.qmd#fig-spatial-plot >}}\n\n\n\n\n\n\n\n\n\n\n@fig-spatial-plot shows the location of recent Earthquakes on La Palma.\n\n## Data & Methods {#sec-data-methods}\n\n## Conclusion\n\n## References {.unnumbered}\n\n::: {#refs}\n:::\n:::\n\n",
+
"markdown": "---\ntitle: Machine Learning in Materials Processing & Characterization\nsubtitle: Course Curriculum and Materials\nauthor:\n - name: Philipp Pelz\n corresponding: true\n roles:\n - Course Instructor\n - Content Development\n affiliations:\n - Materials Science and Engineering\nkeywords:\n - Machine Learning\n - Materials Science\n - Materials Processing\n - Materials Characterization\n - Deep Learning\n - Microstructure Analysis\n - Process Optimization\nabstract: |\n This course provides students with essential skills and practical knowledge to harness machine learning techniques for accelerating materials discovery and design. Specifically tailored for students interested in the new BSc program \"KI-Materialtechnologie\"/AI for materials technology\", it provides hands-on experience with core and advanced machine learning methods—including neural networks, optimization strategies, and generative modelling—to tackle real-world materials science problems. The course focuses on experimental data: microstructures, images, spectra, and processing parameters, connecting the messy, nonlinear world of processing and characterization signals with machine learning tools.\nplain-language-summary: |\n This course teaches how to apply machine learning to materials science problems, focusing on experimental data from characterization techniques (microscopy, spectroscopy) and processing parameters. 
Students learn to build ML pipelines for microstructure classification, process prediction, and spectral analysis, with emphasis on understanding the physics of data formation and avoiding common pitfalls in experimental ML workflows.\nkey-points:\n - Machine learning techniques for materials processing and characterization data\n - Vision-based ML for microstructure analysis and classification\n - Time-series ML for process monitoring and optimization\n - Spectral data analysis using dimensionality reduction and ML\n - Multi-modal data fusion combining images, spectra, and process parameters\ndate: last-modified\nbibliography: references.bib\nnumber-sections: true\n---\n\n:::{#592756e0 .cell .markdown}\n# Machine Learning in Materials Processing & Characterization\n\n**4th Semester – 5 ECTS, 2h lecture + 2h exercises per week**\n\n## Synergy Map\n\n- **This course**: What ML can do with experimental data: microstructures, images, spectra, processing parameters.\n\n- **Parallel ML intro course**: Teaches generic ML algorithms and image processing foundations (skimage, Fourier, wavelets, SVMs, Bayes classifiers).\n\n- **\"Materials Genomics\" course**: Focuses on materials databases, descriptors, crystal graph representations, DFT data, high-throughput workflows, surrogate models.\n \n\n## Week-by-Week Curriculum (14 weeks)\n\n### Unit I — Foundations: From Materials Signals to Machine Learning (Weeks 1–3)\n\n#### Week 1 – What makes materials data special?\n\n- Types of data: micrographs, EBSD, EDS, EELS, XRD, process logs, thermal profiles, deformation curves.\n- PSPP (Processing–Structure–Property–Performance) as a data graph.\n- Why vision-based ML and time-series ML are central to processing & characterization.\n\n#### Week 2 – Image formation & the physics of data\n\n- How characterization creates data: resolution, contrast mechanisms, artifacts.\n- Fourier optics intuition for students with their ML-intro foundations.\n- Sampling, aliasing, denoising as 
model-based priors.\n\n#### Week 3 – Experimental data quality & ML-readiness\n\n- Annotation, segmentation, inter-annotator variance.\n- Train/test leakage in materials workflows.\n\n*(These first three weeks ensure students understand why ML behaves differently on materials data compared to CIFs and DFT databases in the other course.)*\n\n### Unit II — ML for Microstructure: Vision & Representation (Weeks 4–6)\n\n#### Week 4 – Classical microstructure quantification & its ML extension\n\n- Grain size, phase fractions, orientation maps, lineal intercepts.\n- From hand-crafted features → learned representations.\n\n#### Week 5 – Convolutional Neural Networks for microstructure classification\n\n- CNN filters as microstructure interpreters.\n- Example tasks: grain-boundary segmentation, precipitate detection, melt pool defects.\n\n#### Week 6 – Transfer learning & data scarcity in materials characterization\n\n- How to train a model with 200 images instead of 200k.\n- Representations from ImageNet vs self-supervised pretraining on microstructures.\n\n### Unit III — ML in Processing: Time-Series, Optimization, Thermal/Mechanical Data (Weeks 7–9)\n\n#### Week 7 – Process monitoring & time-series ML\n\n- Process logs: temperature cycles, additive manufacturing melt pool monitoring, SPS, rolling, heat treatment.\n- Hidden Markov models, ARIMA, random forest regressors, RNNs (light introduction).\n\n#### Week 8 – Process → structure regression & uncertainty\n\n- Gaussian Processes (synergy with Materials Genomics' surrogate models, but here linked to experimental data).\n- Uncertainty as a tool for process design.\n\n#### Week 9 – Inverse problems in processing\n\n- ML-guided process maps (AM: laser power vs scan speed; metallurgy: TTT/CCT approximations).\n- Physics-informed ML vs naive regression.\n\n*(This unit keeps the ML content firmly processing-centered—distinct from the genomics course's structure-first world.)*\n\n### Unit IV — ML for Characterization Signals (Weeks 10–12)\n\n#### Week 10 – Spectral data: ML for XRD, EELS, EDS\n\n- Peak detection, denoising, background removal.\n- Dimensionality reduction (PCA, NMF, ICA).\n\n#### Week 11 – ML for microscopy automation\n\n- Auto-focusing, drift correction, parameter selection.\n- Vision-based defect detection in EBSD or TEM.\n\n#### Week 12 – Multi-modal data fusion\n\n- Combining images + spectra + process parameters.\n- Early vs late fusion.\n\n*(This module draws synergy with the instructor's research—students love learning real lab-relevant problems.)*\n\n### Unit V — Project + Reflection (Weeks 13–14)\n\n#### Week 13 – Mini-project workshop\n\n**Projects could be:**\n\n- Predict microhardness from heat-treatment + microstructure images.\n- Segment phases in SEM images.\n- Detect porosity in AM melt pool images.\n- Denoise EELS/XRD spectra.\n- Build a process map using Gaussian Processes.\n\n**Students must show:**\n\n1. data prep → 2. model selection → 3. evaluation → 4. uncertainty → 5. interpretation.\n\n#### Week 14 – Presentations + critical evaluation\n\n- Focus on explainability (CAMs, SHAP for simple models).\n- Reflect on why ML sometimes fails on materials data.\n- Wrap-up: Where ML is genuinely changing materials characterization.\n\n## Learning Outcomes\n\nStudents completing this course should be able to:\n\n- Interpret materials characterization and processing data in an ML-ready way.\n- Build ML pipelines for microstructure classification, process prediction, and spectral analysis.\n- Understand the physics of image/signal formation well enough to avoid \"garbage in → garbage out\".\n- Evaluate uncertainty and biases in experimental ML models.\n- Combine processing and characterization data for property prediction.\n- Critically evaluate claims about ML in materials science.\n\n## Lab Possibilities\n\n- **Lab:** Exploring real microscopy datasets; noise, metadata, units.\n- **Lab:** Fourier & wavelet inspection of SEM/TEM/optical micrographs.\n- **Lab:** Correct vs broken experimental ML pipelines; data-leak horror stories.\n- **Lab:** Using scikit-image to extract features; PCA on microstructure descriptors.\n- **Lab:** Fine-tuning a pretrained model on SEM/optical images.\n- **Lab:** Predicting hardness from heat-treatment curves.\n- **Lab:** GP on process parameters (e.g., cooling rate → microstructure metric).\n- **Lab:** Building process maps using ML surrogate models.\n- **Lab:** NMF decomposition of EELS datasets; automatic phase identification in XRD.\n- **Lab:** Implementing a simple \"AI autofocus\" or EBSD pattern classifier.\n- **Lab:** Fusing XRD + microstructure representations for property prediction.\n\n## References {.unnumbered}\n\n::: {#refs}\n:::\n:::\n\n",
"supporting": [
"index_files/figure-docx"
],
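The curriculum added by this commit centres Week 10 (and the PCA lab) on dimensionality reduction of spectral data. As a minimal, dependency-light sketch of that idea, assuming synthetic two-peak spectra in place of real XRD/EELS measurements (all data and names here are illustrative, not from the course materials), PCA via NumPy's SVD recovers the low-dimensional structure:

```python
import numpy as np

# Synthetic "spectra": random mixtures of two Gaussian peaks plus noise,
# standing in for XRD/EELS line scans (hypothetical data, not course data).
rng = np.random.default_rng(0)
energy = np.linspace(0.0, 1.0, 200)
peak_a = np.exp(-((energy - 0.3) ** 2) / 0.002)
peak_b = np.exp(-((energy - 0.7) ** 2) / 0.002)
weights = rng.uniform(0.0, 1.0, size=(50, 2))
spectra = weights @ np.vstack([peak_a, peak_b])
spectra += 0.01 * rng.standard_normal(spectra.shape)

# PCA via SVD of the mean-centred data matrix.
centred = spectra - spectra.mean(axis=0)
_, s, components = np.linalg.svd(centred, full_matrices=False)
explained = (s ** 2) / np.sum(s ** 2)

print(f"variance explained by 2 PCs: {explained[:2].sum():.3f}")
```

Because the data are generated from only two mixing components, the first two principal components should capture nearly all of the variance, which is the diagnostic the Week 10 unit builds on.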

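Week 3 and one of the proposed labs flag train/test leakage as a recurring failure mode in materials ML workflows. A small sketch of why a random patch-level split misleads, using hypothetical near-duplicate "patches" from simulated micrographs and a 1-nearest-neighbour stand-in for the course's real classifiers (everything here is synthetic and illustrative):

```python
import numpy as np

# Each simulated "micrograph" gets its own signature vector; its patches
# are tiny perturbations of it, so patches from one image are near-duplicates.
rng = np.random.default_rng(1)
n_images, patches_per_image, dim = 20, 10, 32
signatures = rng.standard_normal((n_images, dim))
labels_per_image = rng.integers(0, 2, n_images)  # binary phase label per image
X = np.repeat(signatures, patches_per_image, axis=0)
X += 0.05 * rng.standard_normal(X.shape)
y = np.repeat(labels_per_image, patches_per_image)
groups = np.repeat(np.arange(n_images), patches_per_image)

def one_nn_accuracy(train_idx, test_idx):
    """1-nearest-neighbour accuracy for explicit train/test index sets."""
    d = np.linalg.norm(X[test_idx][:, None, :] - X[train_idx][None, :, :], axis=2)
    pred = y[train_idx][np.argmin(d, axis=1)]
    return float(np.mean(pred == y[test_idx]))

# Leaky split: patches shuffled individually, so siblings of each test
# patch almost always sit in the training set.
idx = rng.permutation(len(X))
leaky = one_nn_accuracy(idx[50:], idx[:50])

# Honest split: whole micrographs held out, as the Week 3 unit recommends.
test_mask = np.isin(groups, np.arange(5))
honest = one_nn_accuracy(np.where(~test_mask)[0], np.where(test_mask)[0])

print(f"leaky split accuracy:  {leaky:.2f}")   # near-perfect
print(f"honest split accuracy: {honest:.2f}")  # typically much lower
```

The leaky number looks excellent only because the split leaks near-duplicates across the boundary; grouping the split by source image removes that optimism.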