Changes from all commits (57 commits)
dd739f0
pydantic -- fix imports
kkappler Nov 21, 2025
e27f34b
Update processing_configuration_template.json
kujaku11 Nov 23, 2025
4b740c2
stop auto testing until we address all tests locally.
kujaku11 Nov 23, 2025
714bcf6
Update spectrogram_helpers.py
kujaku11 Nov 28, 2025
1eff691
Update test_issue_139.py
kujaku11 Dec 2, 2025
0a14739
Update config_creator.py
kujaku11 Dec 2, 2025
d0bbde0
updating precommit
kujaku11 Dec 2, 2025
a2c9e6a
Add pytest fixtures and test for TF zrr file roundtrip
kujaku11 Dec 3, 2025
67ee871
Create test_transfer_function_kernel_pytest.py
kujaku11 Dec 3, 2025
056955d
Add synthetic pytest tests and improve test fixtures
kujaku11 Dec 3, 2025
c1473cb
Align num_samples_window in frequency band tests
kujaku11 Dec 3, 2025
543e6bf
Refactor synthetic test MTH5 file creation for test isolation
kujaku11 Dec 3, 2025
0d12513
Migrate synthetic tests from unittest to pytest
kujaku11 Dec 5, 2025
6b51c30
Replace transfer function kernel tests with windowing tests
kujaku11 Dec 5, 2025
a43f6dd
Add pytest suite for xarray_helpers module
kujaku11 Dec 5, 2025
0c2a8da
Remove time series test files
kujaku11 Dec 5, 2025
4d38ac7
Add pytest suite for cross_power transfer function
kujaku11 Dec 5, 2025
3e80920
Add regression tests for helper_functions and remove cross_power tests
kujaku11 Dec 5, 2025
72e11bd
Add regression tests for RegressionEstimator base class
kujaku11 Dec 5, 2025
03627ee
Remove regression test files for transfer function
kujaku11 Dec 5, 2025
e5b55fa
Add comprehensive Parkfield pytest suite and fixtures
kujaku11 Dec 5, 2025
477998f
Update tests.yaml
kujaku11 Dec 5, 2025
cd3f7dc
Refactor feature and channel attribute usage
kujaku11 Dec 5, 2025
5cc6d4b
Update tests.yaml
kujaku11 Dec 5, 2025
403c0e2
Update dataset example for Windows paths and metadata
kujaku11 Dec 5, 2025
4d35d16
skipping notebooks for now
kujaku11 Dec 5, 2025
862ec08
Fix config save and update test signatures
kujaku11 Dec 5, 2025
56ab23a
fix filter additions to use new add_filter method
kkappler Dec 6, 2025
61bb118
force run_id in metadata
kkappler Dec 6, 2025
1098555
update python version info, add some pytest helpers
kkappler Dec 6, 2025
7cf3ae2
Refactor Parkfield tests and fixtures for clarity and robustness
kujaku11 Dec 6, 2025
19eef50
Update test_parkfield_pytest.py
kujaku11 Dec 6, 2025
05a6744
Improve plot handling and test comments, update warnings
kujaku11 Dec 6, 2025
cd6de3c
Improve matplotlib backend handling and test data setup
kujaku11 Dec 8, 2025
d2004fe
Refactor tests to reuse processed transfer functions
kujaku11 Dec 8, 2025
f97161b
Set fixture scope to class for kernel dataset tests
kujaku11 Dec 8, 2025
a733572
Remove obsolete Parkfield test scripts
kujaku11 Dec 8, 2025
e077b6f
Update processing_configuration_template.json
kujaku11 Dec 8, 2025
19d505c
Optimize synthetic test suite with class-scoped fixtures
kujaku11 Dec 8, 2025
ed4d66e
Add ZFile transfer function comparison utilities
kujaku11 Dec 9, 2025
1f4c864
updating how survey metadata is filled
kujaku11 Dec 10, 2025
fd3e9b4
changed default of None to 1
kujaku11 Dec 10, 2025
ffb37fc
Update edf_weights.py
kujaku11 Dec 11, 2025
438ba49
fixing bugs with feature weighting
kujaku11 Dec 12, 2025
f20b578
Update feature_weights.py
kujaku11 Dec 12, 2025
861bf3b
updating logging messages
kujaku11 Dec 12, 2025
f324408
Use persistent cache for Parkfield MTH5 test data
kujaku11 Dec 17, 2025
90d1bd0
Add vectorized pass_band optimization and analysis docs
kujaku11 Dec 17, 2025
784a2d1
Update test_parkfield_pytest.py
kujaku11 Dec 17, 2025
e1727f1
Improve survey metadata handling in TransferFunctionKernel
kujaku11 Dec 17, 2025
201ebfa
removing sandbox test files.
kujaku11 Dec 17, 2025
a79c49d
Add discrete Fourier Coefficients synthetic tests
kujaku11 Dec 18, 2025
f739df7
Enhance synthetic FC tests and add error handling
kujaku11 Dec 20, 2025
64e4894
Refactor and parametrize Fourier Coefficients tests
kujaku11 Dec 20, 2025
bd83c89
Add pytest suite for MATLAB Z-file reader
kujaku11 Dec 21, 2025
d79a21e
Update tests.yaml
kujaku11 Dec 21, 2025
ab00081
Enable parallel test execution and add test dependencies
kujaku11 Dec 21, 2025
30 changes: 15 additions & 15 deletions .github/workflows/tests.yaml
@@ -18,7 +18,7 @@ jobs:
fail-fast: false
matrix:
os: ["ubuntu-latest"]
-  python-version: [3.9, "3.10", "3.11", "3.12"]
+  python-version: ["3.10", "3.11", "3.12"]

steps:
- uses: actions/checkout@v4
@@ -35,31 +35,31 @@ jobs:
run: |
uv venv --python ${{ matrix.python-version }}
uv pip install -e ".[dev,test]"
uv pip install "mt_metadata[obspy] @ git+https://github.com/kujaku11/mt_metadata.git"
uv pip install git+https://github.com/kujaku11/mth5.git
uv pip install "mt_metadata[obspy] @ git+https://github.com/kujaku11/mt_metadata.git@pydantic"
uv pip install git+https://github.com/kujaku11/mth5.git@old_pydantic
uv pip install jupyter ipykernel pytest pytest-cov codecov

- name: Install system dependencies
run: |
sudo apt-get update
sudo apt-get install -y pandoc

-  - name: Execute Jupyter Notebooks
-    run: |
-      source .venv/bin/activate
-      python -m ipykernel install --user --name aurora-test
-      jupyter nbconvert --to notebook --execute docs/examples/dataset_definition.ipynb
-      jupyter nbconvert --to notebook --execute docs/examples/operate_aurora.ipynb
-      jupyter nbconvert --to notebook --execute docs/tutorials/pkd_units_check.ipynb
-      jupyter nbconvert --to notebook --execute docs/tutorials/pole_zero_fitting/lemi_pole_zero_fitting_example.ipynb
-      jupyter nbconvert --to notebook --execute docs/tutorials/processing_configuration.ipynb
-      jupyter nbconvert --to notebook --execute docs/tutorials/process_cas04_multiple_station.ipynb
-      jupyter nbconvert --to notebook --execute docs/tutorials/synthetic_data_processing.ipynb
+  # - name: Execute Jupyter Notebooks
+  #   run: |
+  #     source .venv/bin/activate
+  #     python -m ipykernel install --user --name aurora-test
+  #     jupyter nbconvert --to notebook --execute docs/examples/dataset_definition.ipynb
+  #     jupyter nbconvert --to notebook --execute docs/examples/operate_aurora.ipynb
+  #     jupyter nbconvert --to notebook --execute docs/tutorials/pkd_units_check.ipynb
+  #     jupyter nbconvert --to notebook --execute docs/tutorials/pole_zero_fitting/lemi_pole_zero_fitting_example.ipynb
+  #     jupyter nbconvert --to notebook --execute docs/tutorials/processing_configuration.ipynb
+  #     jupyter nbconvert --to notebook --execute docs/tutorials/process_cas04_multiple_station.ipynb
+  #     jupyter nbconvert --to notebook --execute docs/tutorials/synthetic_data_processing.ipynb

- name: Run Tests
run: |
source .venv/bin/activate
-  pytest -s -v --cov=./ --cov-report=xml --cov=aurora
+  pytest -s -v --cov=./ --cov-report=xml --cov=aurora -n auto tests
# pytest -s -v tests/synthetic/test_fourier_coefficients.py
# pytest -s -v tests/config/test_config_creator.py

49 changes: 42 additions & 7 deletions .pre-commit-config.yaml
@@ -1,10 +1,45 @@
# .pre-commit-config.yaml
repos:
-  - repo: https://github.com/ambv/black
-    rev: 22.6.0
+  - repo: https://github.com/pre-commit/pre-commit-hooks
+    rev: v4.4.0
    hooks:
-      - id: black
-        language_version: python3.10
-  - repo: https://github.com/pycqa/flake8
-    rev: 3.9.2
+      - id: trailing-whitespace
+        types: [python]
+      - id: end-of-file-fixer
+        types: [python]
+      - id: check-yaml
+        exclude: '^(?!.*\.py$).*$'
+
+  - repo: https://github.com/pycqa/isort
+    rev: 5.12.0
    hooks:
-      - id: flake8
+      - id: isort
+        types: [python]
+        exclude: (__init__.py)$
+        files: \.py$
+        args: ["--profile", "black",
+               "--skip-glob","*/__init__.py",
+               "--force-alphabetical-sort-within-sections",
+               "--order-by-type",
+               "--lines-after-imports=2"]
+
+  - repo: https://github.com/psf/black
+    rev: 23.3.0
+    hooks:
+      - id: black
+        types: [python]
+        files: \.py$
+        language_version: python3
+
+  - repo: https://github.com/pycqa/autoflake
+    rev: v2.1.1
+    hooks:
+      - id: autoflake
+        types: [python]
+        files: \.py$
+        args: [
+          "--remove-all-unused-imports",
+          "--expand-star-imports",
+          "--ignore-init-module-imports",
+          "--in-place"
+        ]
2 changes: 1 addition & 1 deletion aurora/__init__.py
@@ -13,7 +13,7 @@
"sink": sys.stdout,
"level": "INFO",
"colorize": True,
"format": "<level>{time} | {level: <3} | {name} | {function} | {message}</level>",
"format": "<level>{time} | {level: <3} | {name} | {function} | line: {line} | {message}</level>",
},
],
"extra": {"user": "someone"},
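The only change to aurora/__init__.py adds the originating line number to the log format. Below is a minimal standalone sketch of how a config dict of this shape is typically applied with loguru; the logger.configure call and the trailing log statement are illustrative assumptions, not part of the diff, and aurora's actual wiring may differ.

# Sketch only: assumes the config dict shown in the diff above is consumed by
# loguru's logger.configure.
import sys

from loguru import logger

config = {
    "handlers": [
        {
            "sink": sys.stdout,
            "level": "INFO",
            "colorize": True,
            # updated format string: now reports the source line number
            "format": "<level>{time} | {level: <3} | {name} | {function} | line: {line} | {message}</level>",
        },
    ],
    "extra": {"user": "someone"},
}
logger.configure(**config)
logger.info("hello")  # rendered record now includes "line: <n>"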
9 changes: 8 additions & 1 deletion aurora/config/config_creator.py
@@ -16,7 +16,7 @@
from aurora.config.metadata import Processing
from aurora.sandbox.io_helpers.emtf_band_setup import EMTFBandSetupFile
from mth5.processing.kernel_dataset import KernelDataset
-  from mt_metadata.transfer_functions.processing.window import Window
+  from mt_metadata.processing.window import Window

import pathlib

@@ -127,6 +127,7 @@ def create_from_kernel_dataset(
kernel_dataset: KernelDataset,
input_channels: Optional[list] = None,
output_channels: Optional[list] = None,
+  remote_channels: Optional[list] = None,
estimator: Optional[str] = None,
emtf_band_file: Optional[Union[str, pathlib.Path]] = None,
band_edges: Optional[dict] = None,
@@ -166,6 +167,8 @@
List of the input channels that will be used in TF estimation (usually "hx", "hy")
output_channels: list
List of the output channels that will be estimated by TF (usually "ex", "ey", "hz")
+  remote_channels: list
+      List of the remote reference channels (usually "hx", "hy" at remote site)
estimator: Optional[Union[str, None]]
The name of the regression estimator to use for TF estimation.
emtf_band_file: Optional[Union[str, pathlib.Path, None]]
@@ -241,6 +244,10 @@ def create_from_kernel_dataset(
else:
decimation_obj.output_channels = output_channels

+  if remote_channels is None:
+      if kernel_dataset.remote_channels is not None:
+          decimation_obj.reference_channels = kernel_dataset.remote_channels

if num_samples_window is not None:
decimation_obj.stft.window.num_samples = num_samples_window[key]
# set estimator if provided as kwarg
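For context, a hedged usage sketch of the new remote_channels keyword. Only the keyword and the channel-list semantics come from the diff above; the kernel_dataset object is assumed to be an already-populated mth5 KernelDataset for a local/remote station pair, and the channel lists are illustrative defaults, not prescribed values.

# Illustrative sketch, not aurora documentation.
from aurora.config.config_creator import ConfigCreator

cc = ConfigCreator()
config = cc.create_from_kernel_dataset(
    kernel_dataset,                       # assumed: a populated KernelDataset
    input_channels=["hx", "hy"],          # local magnetic inputs
    output_channels=["ex", "ey", "hz"],   # channels whose TFs are estimated
    remote_channels=["hx", "hy"],         # new kwarg: remote reference channels
)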
8 changes: 4 additions & 4 deletions aurora/config/metadata/processing.py
@@ -1,6 +1,6 @@
# -*- coding: utf-8 -*-
"""
-  Extend the mt_metadata.transfer_functions.processing.aurora.processing.Processing class
+  Extend the mt_metadata.processing.aurora.processing.Processing class
with some aurora-specific methods.
"""

@@ -10,10 +10,10 @@

from aurora.time_series.windowing_scheme import window_scheme_from_decimation
from loguru import logger
-  from mt_metadata.transfer_functions.processing.aurora.processing import (
+  from mt_metadata.processing.aurora.processing import (
Processing as AuroraProcessing,
)
-  from mt_metadata.utils.list_dict import ListDict
+  from mt_metadata.common.list_dict import ListDict
from typing import Optional, Union

import json
@@ -192,7 +192,7 @@ class EMTFTFHeader(ListDict):
def __init__(self, **kwargs):
"""
Parameters
-  _local_station : mt_metadata.transfer_functions.tf.station.Station()
+  _local_station : mt_metadata.processing.tf.station.Station()
Station metadata object for the station to be estimated (
location, channel_azimuths, etc.)
_remote_station: same object type as local station
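Taken together, the import changes in this PR point at relocated mt_metadata classes. A short summary of the new locations used by this branch, collected from the diffs above; these paths are assumed to resolve only against the mt_metadata pydantic branch that the updated workflow now pins.

# New import paths used on this branch (per the diffs above); each replaces a
# previous mt_metadata.transfer_functions.* or mt_metadata.utils.* location.
from mt_metadata.processing.window import Window
from mt_metadata.processing.aurora.processing import Processing as AuroraProcessing
from mt_metadata.common.list_dict import ListDict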