diff --git a/Lectures/Lecture - Non Parametric Analysis/Lecture-MI.ipynb b/Lectures/Lecture - Non Parametric Analysis/Lecture-MI.ipynb
new file mode 100644
index 0000000000000000000000000000000000000000..c3564bdde0537d781db28d5d8f8ce052ee245955
--- /dev/null
+++ b/Lectures/Lecture - Non Parametric Analysis/Lecture-MI.ipynb	
@@ -0,0 +1,259 @@
+{
+ "cells": [
+  {
+   "cell_type": "markdown",
+   "id": "91330533",
+   "metadata": {},
+   "source": [
+    "**What is this?**\n",
+    "\n",
+    "\n",
+    "*This jupyter notebook is part of a collection of notebooks on various topics discussed during the Time Domain Astrophysics course delivered by Stefano Covino at the [Università dell'Insubria](https://www.uninsubria.eu/) in Como (Italy). Please direct questions and suggestions to [stefano.covino@inaf.it](mailto:stefano.covino@inaf.it).*"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "915ee876",
+   "metadata": {},
+   "source": [
+    "**This is a `textual` notebook**"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "53194e25",
+   "metadata": {},
+   "source": [
+    "![Time Domain Astrophysics](Pics/TimeDomainBanner.jpg)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "3a9502d7",
+   "metadata": {},
+   "source": [
+    "# Mutual Information and Time-Series Analysis\n",
+    "***\n",
+    "\n",
+    "> We see here an algorithm introduced by [Huijse et al. (2018)](https://ui.adsabs.harvard.edu/abs/2018ApJS..236...12H/abstract) to identify light-curve periods based on quadratic mutual information (MI). We invite interested readers to check the original publication for further details.\n",
+    "\n",
+    "- This is a non-parametric method since it does not rely on sinusoidal models for the data. \n",
+    "\n",
+    "- Instead, a metric on the phase diagram of the light curve $\\{\\phi_i, m_i\\}_{i=1,\\ldots,N}$ is optimized, where the magnitudes $m_i$ and the phases $\\phi_i$ are obtained from the observation times $t_i$ for a given trial period $P$ as:\n",
+    "\n",
+    "$$ \\phi_i = \\frac{\\text{mod}(t_i, P)}{P} ~ \\in ~ [0, 1], $$\n",
+    "\n",
+    "- where $\\text{mod}(\\cdot, \\cdot)$ stands for the division remainder operator.\n",
+    "\n",
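+    "- As a quick illustration, the folding step can be sketched in a few lines (a minimal Python sketch, not the paper's code; the observation times below are made up):\n",
+    "\n",
+    "```python\n",
+    "import numpy as np\n",
+    "\n",
+    "def fold(t, P):\n",
+    "    \"\"\"Epoch-fold observation times t with trial period P; returns phases in [0, 1).\"\"\"\n",
+    "    return np.mod(t, P) / P\n",
+    "\n",
+    "t = np.array([0.0, 1.3, 2.7, 10.1])  # observation times (e.g. days)\n",
+    "phi = fold(t, 2.0)                   # phases for a trial period of 2 days\n",
+    "print(phi)                           # phases: 0.0, 0.65, 0.35, 0.05\n",
+    "```\n",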
+    "<br>\n",
+    "\n",
+    "- We'll make an extensive use of the information theoretic concept of Mutual Information (MI). \n",
+    "    - In a broad sense, MI measures the reduction of the uncertainty of a random variable (RV) given that we know a second RV. MI can also be seen as a measure of dependence; unlike correlation, however, MI is able to capture non-linear dependence between RVs. \n",
+    "    - More formally, MI is posed as the divergence (statistical distance) between the joint probability density function (PDF) of the RVs and the product of their marginal PDFs.\n",
+    "\n",
+    "- Several definitions of MI exist in the literature, Shannon's MI being the best known ([Gray 2023](https://ee.stanford.edu/~gray/it.pdf)). \n",
+    "- Shannon's MI for continuous RVs $X$ and $Y$ with joint PDF $f_{X, Y}(\\cdot, \\cdot)$ is defined as: \n",
+    "\n",
+    "$$   \\text{MI}_S(X, Y) = D_{KL}(f_{X,Y} || f_X f_Y) = \\iint f_{X,Y} \\log f_{X,Y} \\,dx \\,dy - \\int f_{X} \\log f_X  \\,dx  - \\int f_{Y} \\log f_Y \\,dy $$\n",
+    "- where $D_{KL}(\\cdot || \\cdot)$ is the Kullback-Leibler divergence and $f_X (x)= \\int f_{X,Y} (x,y)\\,dy$, $f_Y (y) = \\int f_{X,Y} (x, y)\\,dx$ are the marginal PDFs of $X$ and $Y$, respectively.\n",
+    "\n",
+    "<br>\n",
+    "\n",
+    "- We intend to avoid the estimation of the PDF by using MI definitions arising from generalized divergences. Such MI estimators have been proposed in the Information Theoretic Learning (ITL, e.g., [Principe 2000](https://link.springer.com/article/10.1023/A:1008143417156)) literature. \n",
+    "- In what follows we present the derivation of two MI definitions for continuous RVs from the ITL framework. \n",
+    "\n",
+    "- Starting from the Euclidean distance between probability density functions:\n",
+    "\n",
+    "$$  D_{ED}(f(x) || g(x)) = \\int (f(x) - g(x))^2 \\,dx,  $$\n",
+    "\n",
+    "- the Euclidean distance Quadratic MI between RVs $X$ and $Y$ is defined as: \n",
+    "\n",
+    "$$ \\text{QMI}_{ED}(X, Y) = D_{ED}(f_{X,Y} (x,y)|| f_{X}(x) f_{Y}(y)) = $$\n",
+    "$$ = \\iint  f_{X,Y}^2 \\,dx \\,dy - 2 \\iint f_{X,Y} f_X f_Y \\,dx \\,dy +  \\int f_X^2 \\,dx \\int f_Y ^2 \\,dy = V_J - 2 V_C + V_M $$\n",
+    "\n",
+    "- where $f_{X,Y}(\\cdot, \\cdot)$ is the joint PDF of $X$ and $Y$ while $f_X(\\cdot)$ and $f_Y(\\cdot)$ are the marginal PDFs, respectively.\n",
+    "\n",
+    "- The terms $V_J$, $V_M$ and $V_C$ correspond to the integrals of the squared joint PDF, squared product of the marginal PDFs and product of joint PDF and marginal PDFs, respectively. \n",
+    "\n",
+    "> The ITL framework provides an estimator of these quantities that can be computed directly from data samples. This estimator is called the Information Potential (IP) of an RV and it corresponds to the expected value of its PDF. \n",
+    "\n",
+    "- Note that the expected value of a PDF is equivalent to the integral of the squared PDF.\n",
+    "\n",
+    "<br>\n",
+    "\n",
+    "- In ITL a strong emphasis is given to the estimation of these quantities directly from data in a non-parametric way. \n",
+    "\n",
+    "- As an example consider the ITL estimation of the so-called [Renyi](https://en.wikipedia.org/wiki/R%C3%A9nyi_entropy)'s second order generalization $H_{2} (X)$ of Shannon's entropy of a continuous RV: \n",
+    "\n",
+    "$$ H_{2} (X) = - \\log \\int f_X(x)^2 \\,dx, $$\n",
+    "\n",
+    "- where $f_X(x)$ is the RV's PDF. \n",
+    "\n",
+    "- Assuming that we have $\\{x_i\\}_{i=1,\\dots,N}$ realizations of the RV, its PDF can be estimated using a kernel density estimator (KDE):\n",
+    "\n",
+    "$$ f_X(x) = \\frac{1}{N} \\sum_{i=1}^N \\text{G}_h \\left( x-x_i\\right) = \\frac{1}{N\\sqrt{2\\pi}h} \\sum_{i=1}^N \\exp \\left( -\\frac{\\|x-x_i\\|^2}{2h^2} \\right), $$\n",
+    "\n",
+    "- where $\\text{G}_h(\\cdot)$ is the Gaussian kernel with bandwidth $h$.\n",
+    "\n",
+    "- Using the Gaussian convolution property, i.e. the convolution of two Gaussian functions is also a Gaussian, we obtain:\n",
+    "\n",
+    "$$ H_{2} (X) = - \\log \\frac{1}{N^2} \\int  \\sum_{i=1}^N \\sum_{j=1}^N  \\text{G}_h \\left( x-x_i \\right)  \\text{G}_h \\left(x-x_j \\right) \\,dx = - \\log \\frac{1}{N^2}  \\sum_{i=1}^N \\sum_{j=1}^N  \\text{G}_{\\sqrt{2}h} \\left( x_i-x_j \\right) = - \\log \\text{IP}_X, $$\n",
+    "\n",
+    "- where $\\text{IP}_X$ is the Information Potential (IP), an estimator of the expected value of the PDF of $X$ estimated directly from the data samples bypassing the estimation of the PDF. \n",
+    "\n",
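+    "- A numerical sanity check of this estimator (a hedged Python sketch; the function name and sample size are made up): for a standard normal RV the true value is $H_2 = \\log (2\\sqrt{\\pi}) \\approx 1.27$.\n",
+    "\n",
+    "```python\n",
+    "import numpy as np\n",
+    "\n",
+    "def information_potential(x, h):\n",
+    "    \"\"\"IP_X: average of the Gaussian kernel with bandwidth sqrt(2)*h over all sample pairs.\"\"\"\n",
+    "    d = x[:, None] - x[None, :]          # pairwise differences x_i - x_j\n",
+    "    s = np.sqrt(2.0) * h\n",
+    "    G = np.exp(-d**2 / (2.0 * s**2)) / (np.sqrt(2.0 * np.pi) * s)\n",
+    "    return G.mean()                      # (1/N^2) * double sum\n",
+    "\n",
+    "x = np.random.default_rng(0).normal(size=500)\n",
+    "H2 = -np.log(information_potential(x, h=0.3))  # Renyi second-order entropy estimate\n",
+    "print(H2)  # roughly 1.3, close to the analytic value\n",
+    "```\n",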
+    "<br>\n",
+    "\n",
+    "- Assuming that we have $\\{x_i, y_i\\}_{i=1,\\ldots,N}$ i.i.d. realizations of RVs $X$ and $Y$ and using the IP estimator we get:\n",
+    "$$     V_M = \\text{IP}_X \\text{IP}_Y = \\left (\\frac{1}{N^2} \\sum_{i,j=1}^{N,N}  \\text{G}_{\\sqrt{2}h} \\left( x_i-x_j \\right) \\right) \\left ( \\frac{1}{N^2} \\sum_{i,j=1}^{N,N} \\text{G}_{\\sqrt{2}h} \\left( y_i-y_j \\right) \\right), $$\n",
+    "\n",
+    "$$ V_J = \\text{IP}_{X,Y} = \\frac{1}{N^2} \\sum_{i=1}^{N} \\sum_{j=1}^{N} \\text{G}_{\\sqrt{2}h} \\left( x_i-x_j \\right)  \\text{G}_{\\sqrt{2}h} \\left( y_i-y_j \\right), $$\n",
+    "\n",
+    "$$  V_C = \\text{IP}_{X\\times Y} = \\frac{1}{N} \\sum_{i=1}^{N}  \\left ( \\frac{1}{N}\\sum_{j=1}^{N} \\text{G}_{\\sqrt{2}h} \\left( x_i-x_j \\right) \\right) \\left ( \\frac{1}{N} \\sum_{j=1}^{N} \\text{G}_{\\sqrt{2}h} \\left( y_i-y_j \\right) \\right), $$\n",
+    "\n",
+    "- where $\\text{G}_{h} \\left( x \\right) = \\frac{1}{\\sqrt{2\\pi}h} \\exp \\left( -\\frac{\\|x\\|^2}{2h^2} \\right)$ is the Gaussian kernel with bandwidth $h$. \n",
+    "\n",
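+    "- Putting the three potentials together gives a sample-based $\\text{QMI}_{ED}$ (a hedged Python sketch; the bandwidth and test data are made up):\n",
+    "\n",
+    "```python\n",
+    "import numpy as np\n",
+    "\n",
+    "def gram(v, h):\n",
+    "    \"\"\"Pairwise Gaussian kernel matrix with bandwidth sqrt(2)*h.\"\"\"\n",
+    "    d = v[:, None] - v[None, :]\n",
+    "    s = np.sqrt(2.0) * h\n",
+    "    return np.exp(-d**2 / (2.0 * s**2)) / (np.sqrt(2.0 * np.pi) * s)\n",
+    "\n",
+    "def qmi_ed(x, y, h=0.5):\n",
+    "    \"\"\"Euclidean-distance quadratic MI: V_J - 2 V_C + V_M.\"\"\"\n",
+    "    Gx, Gy = gram(x, h), gram(y, h)\n",
+    "    V_J = (Gx * Gy).mean()                            # IP_{X,Y}\n",
+    "    V_C = (Gx.mean(axis=1) * Gy.mean(axis=1)).mean()  # IP_{X x Y}\n",
+    "    V_M = Gx.mean() * Gy.mean()                       # IP_X * IP_Y\n",
+    "    return V_J - 2.0 * V_C + V_M\n",
+    "\n",
+    "rng = np.random.default_rng(1)\n",
+    "x = rng.normal(size=400)\n",
+    "y_dep = x + 0.1 * rng.normal(size=400)  # strongly dependent on x\n",
+    "y_ind = rng.normal(size=400)            # independent of x\n",
+    "print(qmi_ed(x, y_dep), qmi_ed(x, y_ind))  # clearly positive vs. close to zero\n",
+    "```\n",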
+    "<br>\n",
+    "\n",
+    "- The second ITL quadratic MI that we consider is obtained by defining a divergence measure based on the Cauchy-Schwarz inequality:\n",
+    "\n",
+    "$$ D_{CS}(f(x) || g(x)) = -\\log \\frac{\\left(\\int f(x)g(x) \\,dx\\right)^2}{\\int f(x)^2 \\,dx \\int g(x)^2 \\,dx}, $$\n",
+    "\n",
+    "- then the Cauchy-Schwarz Quadratic MI for continuous RVs $X$ and $Y$ becomes:\n",
+    "\n",
+    "$$ \\text{QMI}_{CS}(X, Y) = D_{CS}(f_{X,Y} (x,y)|| f_{X}(x) f_{Y}(y)) = $$\n",
+    "$$ = \\log \\iint f_{X,Y}^2 \\,dx \\,dy -2 \\log \\iint  f_{X,Y} f_X f_Y \\,dx \\,dy + $$\n",
+    "$$ + \\log \\int f_X^2 \\,dx \\int f_Y ^2 \\,dy = \\log V_J - 2 \\log V_C + \\log V_M, $$"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "9208993d",
+   "metadata": {},
+   "source": [
+    "## Period Estimation by Maximizing Mutual Information\n",
+    "***\n",
+    "\n",
+    "- A typical analysis starts by applying the epoch folding transformation for a certain trial period to the (possibly) unevenly sampled time-series to obtain the phase diagram $\\{\\phi_i, m_i, \\sigma_i\\}_{i=1,\\ldots,N}$. \n",
+    "- We assume that the light curve is periodic with an unknown period. The phases $\\{\\phi_i\\}$ correspond to our non-parametric model of the periodicity, while $\\{m_i\\}$ correspond to our noisy observations. As usual $\\{\\sigma_i\\}$ are the estimated errors on our observations. \n",
+    "- If the light curve is periodic with period $P_T$, then folding with this period will yield the model that best explains our observations. This can be measured by calculating the MI between phases and magnitudes, i.e. the amount of information shared by model and observation. \n",
+    "- We can test several models (foldings) and find the one which maximizes MI to detect the best period. \n",
+    "    - Notice that MI requires independent and identically distributed (i.i.d.) realizations of the RVs. Although light curves are time series, and hence serial correlations in time exist, these correlations are broken in the phase diagram.\n",
+    "    - Phase is a function of time and period, and several periods are tested per light curve. If the period is not related to the underlying periodicity of the data, the phase diagram is filled uniformly and serial correlations in the joint space are broken. \n",
+    " \n",
+    "- A second interpretation of using MI for periodicity detection rests on MI's definition as the divergence (statistical distance) between the joint PDF and the product of the marginal PDFs of the RVs. \n",
+    "    - If the light curve is folded with a wrong period, the structure in the joint PDF will be almost equal to the product of the marginal PDFs, i.e. magnitudes (or fluxes) are independent of the phases. \n",
+    "    - On the other hand, if the correct period is chosen the joint PDF will present structure that is not captured by the product of the marginals. By maximizing MI we are maximizing the dependency between model and observations.\n",
+    "\n",
+    "<br>\n",
+    "\n",
+    "- Let's denote $M$ and $\\Phi$ as the RVs associated with magnitude and phase, respectively. We can estimate the PDF of $M$ given its realizations using KDE as follows:\n",
+    "\n",
+    "$$ f_M(m) = \\frac{1}{N} \\sum_{i=1}^N \\text{G}_{\\sqrt{\\sigma_i^2+h_m^2}}(m-m_i) = \\frac{1}{N} \\sum_{i=1}^N \\frac{1}{\\sqrt{2 \\pi (\\sigma_i^2 + h_m^2)}} \\exp \\left( - \\frac{1}{2} \\frac{(m - m_i )^2}{(\\sigma_i^2 + h_m^2)} \\right), $$\n",
+    "- where each sample $m_i$ has a bandwidth that incorporates the KDE bandwidth $h_m$ and its given uncertainty $\\sigma_i$. \n",
+    "\n",
+    "- As $\\Phi$ is a periodic RV we need a periodic kernel to appropriately estimate its PDF. We consider a kernel arising from the Wrapped Cauchy (WC) distribution (Jammalamadaka & SenGupta 2001) and estimate $\\Phi$'s PDF as:\n",
+    "\n",
+    "$$ f_\\Phi(\\phi) = \\frac{1}{N} \\sum_{i=1}^N \\text{WC}_{h_\\phi}(\\phi-\\phi_i)  = \\frac{1}{2 \\pi N} \\sum_{i=1}^N \\frac{1 - e^{-2 h_\\phi}}{1 + e^{-2 h_\\phi} - 2 e^{- h_\\phi} \\cos(2\\pi (\\phi - \\phi_i))}, $$\n",
+    "- where $h_\\phi \\in (0, \\infty)$ is the scale of the Cauchy distribution. \n",
+    "\n",
+    "- For $h_\\phi \\to \\infty$ the WC kernel behaves like the circular uniform distribution, while for $h_\\phi \\to 0$ it concentrates on its mean. \n",
+    "\n",
+    "- The joint PDF of $\\Phi$ and $M$ is estimated as:\n",
+    "\n",
+    "$$ f_{\\Phi, M}(\\phi, m) = \\frac{1}{N} \\sum_{i=1}^N \\text{G}_{\\sqrt{\\sigma_i^2+h_m^2}}(m-m_i) \\cdot \\text{WC}_{h_\\phi}(\\phi-\\phi_i), $$\n",
+    "- because the product of valid kernel functions is also a valid kernel.\n",
+    "\n",
+    "<br>\n",
+    "\n",
+    "- Using the Gaussian kernel for the magnitudes (or fluxes) and the WC kernel for phases we obtain:\n",
+    "$$   \\text{IP}_M = \\frac{1}{N^2} \\sum_{i=1}^N \\sum_{j=1}^{N}  \\text{G}_{\\sqrt{2h_m^2 + \\sigma_i^2+ \\sigma_j^2}} \\left( m_i-m_j \\right), $$\n",
+    "$$    \\text{IP}_\\Phi = \\frac{1}{N^2} \\sum_{i=1}^N \\sum_{j=1}^{N}  \\text{WC}_{2 h_\\phi} \\left( \\phi_i-\\phi_j \\right), $$\n",
+    "$$    \\text{IP}_{\\Phi,M} = \\frac{1}{N^2} \\sum_{i=1}^N \\sum_{j=1}^{N}  \\text{G}_{\\sqrt{2h_m^2 + \\sigma_i^2+ \\sigma_j^2}} \\left( m_i-m_j \\right) \\text{WC}_{2 h_\\phi} \\left( \\phi_i-\\phi_j \\right), $$\n",
+    "\n",
+    "- and, for the cross term:\n",
+    "$$    \\text{IP}_{\\Phi \\times M} = \\frac{1}{N} \\sum_{i=1}^{N}  \\left ( \\frac{1}{N}\\sum_{j=1}^{N}  \\text{G}_{\\sqrt{2h_m^2 + \\sigma_i^2+ \\sigma_j^2}} \\left( m_i-m_j \\right) \\right) \\left ( \\frac{1}{N} \\sum_{j=1}^{N} \\text{WC}_{2 h_\\phi} \\left( \\phi_i-\\phi_j \\right) \\right), $$\n",
+    "\n",
+    "<br>\n",
+    "\n",
+    "- Through these potentials we restate the QMI estimators as:\n",
+    "$$ \\text{QMI}_{ED}(\\Phi, M) =   \\text{IP}_{\\Phi,M} - 2 \\text{IP}_{\\Phi \\times M} + \\text{IP}_\\Phi \\text{IP}_M, $$\n",
+    "$$ \\text{QMI}_{CS}(\\Phi, M) =   \\log \\text{IP}_{\\Phi,M} - 2 \\log \\text{IP}_{\\Phi \\times M} + \\log \\text{IP}_\\Phi  + \\log \\text{IP}_M, $$\n",
+    "\n",
+    "> The period of a light curve is estimated by maximizing the QMI over a range of trial periods: the QMI as a function of trial period yields a QMI periodogram!\n",
+    "\n",
+    "<br>\n",
+    "\n",
+    "- An interesting problem by itself is the choice of the KDE bandwidth. In this case we have two parameters $h_\\phi$ and $h_m$. \n",
+    "    - The former is associated with the phases, which are always constrained to $[0, 1]$, i.e. the dynamic range of this variable is fixed. QMI is not too sensitive to $h_\\phi$ as long as it is not extremely small or large. It was found empirically that $h_\\phi = 1$ is a good choice, and we keep it constant to make comparisons between QMI values easier. \n",
+    "    - The second bandwidth $h_m$ is more difficult to set, as the dynamic range of the magnitudes is not known *a priori*. Following [Silverman (1986)](https://www.taylorfrancis.com/books/mono/10.1201/9781315140919/density-estimation-statistics-data-analysis-bernard-silverman), we may write:\n",
+    "$$ h_m = 0.9 \\cdot \\text{min} ( \\sqrt{\\text{VAR}[m]}, ~\\text{IQR}[m]/1.349) \\cdot N^{-1/5}, $$\n",
+    "- where $\\text{VAR}[m]$ is the variance of the magnitudes, $\\text{IQR}[m]$ is the interquartile range of the magnitudes and $N$ is the number of samples. To avoid overestimation of $h_m$ we use the weighted versions of variance and IQR, with weights $w_i = \\sigma_i^{-2}$, $i = 1,\\ldots,N$. "
+   ]
+  },
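+  {
+   "cell_type": "markdown",
+   "id": "b7f3a2c1",
+   "metadata": {},
+   "source": [
+    "- The full recipe above can be sketched end-to-end on synthetic data (a simplified Python illustration, not the authors' implementation; the signal, grids and sample sizes are made up):\n",
+    "\n",
+    "```python\n",
+    "import numpy as np\n",
+    "\n",
+    "def wc_kernel(dphi, h):\n",
+    "    \"\"\"Wrapped Cauchy kernel on phase differences (phases in [0, 1)).\"\"\"\n",
+    "    num = 1.0 - np.exp(-2.0 * h)\n",
+    "    den = 1.0 + np.exp(-2.0 * h) - 2.0 * np.exp(-h) * np.cos(2.0 * np.pi * dphi)\n",
+    "    return num / (2.0 * np.pi * den)\n",
+    "\n",
+    "def qmi_ed_phase(t, m, sigma, P, h_m, h_phi=1.0):\n",
+    "    \"\"\"QMI_ED between phases (for trial period P) and magnitudes.\"\"\"\n",
+    "    phi = np.mod(t, P) / P\n",
+    "    s2 = 2.0 * h_m**2 + sigma[:, None]**2 + sigma[None, :]**2   # pairwise bandwidths\n",
+    "    Gm = np.exp(-(m[:, None] - m[None, :])**2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)\n",
+    "    Wp = wc_kernel(phi[:, None] - phi[None, :], 2.0 * h_phi)\n",
+    "    V_J = (Gm * Wp).mean()                            # IP_{Phi,M}\n",
+    "    V_C = (Gm.mean(axis=1) * Wp.mean(axis=1)).mean()  # IP_{Phi x M}\n",
+    "    V_M = Gm.mean() * Wp.mean()                       # IP_Phi * IP_M\n",
+    "    return V_J - 2.0 * V_C + V_M\n",
+    "\n",
+    "rng = np.random.default_rng(2)\n",
+    "t = np.sort(rng.uniform(0.0, 50.0, 200))   # uneven sampling over a 50-day baseline\n",
+    "sigma = np.full(200, 0.05)                 # photometric errors\n",
+    "m = np.sin(2.0 * np.pi * t / 3.7) + sigma * rng.normal(size=200)\n",
+    "h_m = 0.9 * min(np.std(m), np.subtract(*np.percentile(m, [75, 25])) / 1.349) * len(m) ** -0.2\n",
+    "periods = np.linspace(3.0, 4.5, 301)\n",
+    "qmi = np.array([qmi_ed_phase(t, m, sigma, P, h_m) for P in periods])\n",
+    "print(periods[np.argmax(qmi)])  # the peak should land near the true period of 3.7\n",
+    "```\n",
+    "\n",
+    "- The grid of trial periods, the noise level and the unweighted Silverman estimate of $h_m$ are simplifications; the text above uses the weighted variance and IQR instead, and a production implementation would scan a much denser frequency grid."
+   ]
+  },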
+  {
+   "cell_type": "markdown",
+   "id": "7f29ed0d-ddc3-4e16-beab-721f8e09079c",
+   "metadata": {},
+   "source": [
+    "## Reference & Material\n",
+    "\n",
+    "Material and papers related to the topics discussed in this lecture.\n",
+    "\n",
+    "- [Huijse et al. (2018) - Robust Period Estimation Using Mutual Information for Multiband Light Curves in the Synoptic Survey](https://ui.adsabs.harvard.edu/abs/2018ApJS..236...12H/abstract)"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "05e93b1d",
+   "metadata": {},
+   "source": [
+    "## Course Flow\n",
+    "\n",
+    "<table>\n",
+    "  <tr>\n",
+    "    <td>Previous lecture</td>\n",
+    "    <td>Next lecture</td>\n",
+    "  </tr>\n",
+    "  <tr>\n",
+    "    <td><a href=\"../Lecture%20-%20Non%20Parametric%20Analysis/Lecture-NonParametricAnalysis.ipynb\">Non-parametric analysis</a></td>\n",
+    "    <td><a href=\"../Lecture%20-%20Non%20Parametric%20Analysis/Lecture-NonParametricAnalysis.ipynb\">Non-parametric analysis</a></td>\n",
+    "  </tr>\n",
+    " </table>\n",
+    "\n"
+   ]
+  },
+  {
+   "cell_type": "markdown",
+   "id": "591bd355",
+   "metadata": {},
+   "source": [
+    "**Copyright**\n",
+    "\n",
+    "This notebook is provided as [Open Educational Resource](https://en.wikipedia.org/wiki/Open_educational_resources). Feel free to use the notebook for your own purposes. The text is licensed under [Creative Commons Attribution 4.0](https://creativecommons.org/licenses/by/4.0/), the code of the examples, unless obtained from other properly quoted sources, under the [MIT license](https://opensource.org/licenses/MIT). Please attribute the work as follows: *Stefano Covino, Time Domain Astrophysics - Lecture notes featuring computational examples, 2024*."
+   ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": null,
+   "id": "2c3205e6-4df5-403a-b175-f24e35028051",
+   "metadata": {},
+   "outputs": [],
+   "source": []
+  }
+ ],
+ "metadata": {
+  "kernelspec": {
+   "display_name": "Julia 1.11.3",
+   "language": "julia",
+   "name": "julia-1.11"
+  },
+  "language_info": {
+   "file_extension": ".jl",
+   "mimetype": "application/julia",
+   "name": "julia",
+   "version": "1.11.3"
+  }
+ },
+ "nbformat": 4,
+ "nbformat_minor": 5
+}
diff --git a/Lectures/Lecture - Non Parametric Analysis/Lecture-NonParametricAnalysis.ipynb b/Lectures/Lecture - Non Parametric Analysis/Lecture-NonParametricAnalysis.ipynb
index 7a11020754dcd7edf7e072fd5c4a98d92f6dac62..f47c9979a99f68077f9852ef29ff36dab54545bc 100644
--- a/Lectures/Lecture - Non Parametric Analysis/Lecture-NonParametricAnalysis.ipynb	
+++ b/Lectures/Lecture - Non Parametric Analysis/Lecture-NonParametricAnalysis.ipynb	
@@ -587,7 +587,9 @@
   {
    "cell_type": "markdown",
    "id": "3a65b5e4",
-   "metadata": {},
+   "metadata": {
+    "jp-MarkdownHeadingCollapsed": true
+   },
    "source": [
     "## Mutual Information\n",
     "\n",
@@ -642,7 +644,9 @@
     "- If the light curve is folded with the wrong period, the structure in the joint PDF will be almost equal to the product of the marginal PDFs, i.e., magnitudes are independent of the phases. \n",
     "- On the other hand, if the correct period is chosen, the joint PDF will present a structure that is not captured by the product of the marginals. By maximizing MI we are maximizing the dependency between model and observations.\n",
     "\n",
-    "![Simulated light curve and MI periodogram](Pics/MISimandPeriodogram.png)"
+    "![Simulated light curve and MI periodogram](Pics/MISimandPeriodogram.png)\n",
+    "\n",
+    "- A more detailed analysis of the math behind MI application in Time-Series analysis can be found [here](Lecture-MI.ipynb)."
    ]
   },
   {
@@ -719,7 +723,7 @@
  ],
  "metadata": {
   "kernelspec": {
-   "display_name": "Julia 1.11.1",
+   "display_name": "Julia 1.11.3",
    "language": "julia",
    "name": "julia-1.11"
   },
@@ -727,7 +731,7 @@
    "file_extension": ".jl",
    "mimetype": "application/julia",
    "name": "julia",
-   "version": "1.11.1"
+   "version": "1.11.3"
   }
  },
  "nbformat": 4,
diff --git a/Lectures/Lecture - Spectral Analysis/Lecture-SpectralAnalysis.ipynb b/Lectures/Lecture - Spectral Analysis/Lecture-SpectralAnalysis.ipynb
index 135e4be37d8cba8966eba59cd53dcc37d22b7883..a7835de4d913f286089e0a59269846101c8cf9b8 100644
--- a/Lectures/Lecture - Spectral Analysis/Lecture-SpectralAnalysis.ipynb	
+++ b/Lectures/Lecture - Spectral Analysis/Lecture-SpectralAnalysis.ipynb	
@@ -271,7 +271,7 @@
     "id": "4367515f"
    },
    "source": [
-    "### Exercize about an nalysis of weather data in France\n",
+    "### Exercise about an analysis of weather data in France\n",
     "***\n",
     "\n",
     "- In the following exercize, we are going to analyse weather data spanning about 20 years in France obtained from the US National Climatic Data Center.\n",
diff --git a/README.md b/README.md
index 4a0c687c7bbb30828715b0159e6a1b8f73dad993..094017d377e9f378acb08f81b49cc9bbe24e7da3 100644
--- a/README.md
+++ b/README.md
@@ -2,4 +2,4 @@
 
 This is a repository with material (notebooks, papers, etc.) for the **Time Domain Astrophysics** course delivered at the *Università dell'Insubria* by Stefano Covino. 
 
-*Last update: 13 January 2025.*
+*Last update: 31 January 2025.*