
Commit 1a1e7a0

Simplify dependencies, release prep for 3.1 (#221)
* Update authors, changelog
* Add Python 3.12 to the pyproject classifiers
* Bump the version to 3.1.0
* Simplify dependencies; make statsmodels optional
* Add a note; run the statsmodels tests if available
* Remove unneeded "pass" statements and unnecessary whitespace
* Add Python 3.12 to the hatch test matrix
* Update changelog
1 parent b0fcb50 commit 1a1e7a0

18 files changed (+55, −89 lines)

.bumpversion.cfg

Lines changed: 1 addition & 1 deletion
@@ -1,5 +1,5 @@
 [bumpversion]
-current_version = 3.0.3
+current_version = 3.1.0
 commit = True
 tag = True

AUTHORS.rst

Lines changed: 1 addition & 0 deletions
@@ -5,3 +5,4 @@ Authors
 * Bas des Tombe - https://github.com/bdestombe
 * Bart Schilperoort - https://github.com/BSchilperoort
 * Karl Lapo - https://github.com/klapo
+* David Rautenberg - https://github.com/davechris

CHANGELOG.rst

Lines changed: 12 additions & 5 deletions
@@ -1,18 +1,25 @@
 
 Changelog
 =========
-3.0.4 (2024-08-30)
+3.1.0 (2024-09-13)
 ---
 
+Added
+
+* support for Python 3.12.
+* AP sensing .tra support, as the reference temperature sensor data of this device is only logged in .tra and not in the .xml log files.
+  Added functions in io/apsensing.py to read .tra files if they are in the same directory as the .xml files.
+* more test data from AP sensing device N4386B, which also contain their .tra log files.
+
 Fixed
 
 * device ID bug for APSensing. Device ID is N4386B instead of C320. C320 was an arbitrary name given for the wellbore by the user.
 
-Added
+Changed
+
+* the `verify_timedeltas` keyword argument is now optional when merging two single-ended datasets.
+* removed `statsmodels` as a dependency. It is now optional, and only used for testing the `wls_sparse` solver.
 
-* more test data from AP sensing device N4386B, which do also contain their .tra log files
-* AP sensing .tra support, as the reference temperature sensor data by this device in only logged in .tra and not in the .xml log files.
-added functions in io/apsensing.py to read .tra files if they are in the same directory as the .xml files.
 
 3.0.3 (2024-04-18)
 ---
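The `verify_timedeltas` entry in the changelog refers to checking that two datasets were logged with (nearly) identical time steps before merging them. A minimal conceptual sketch of such a check (the `timedeltas_match` helper is hypothetical, not dtscalibration's implementation; assumes NumPy):

```python
import numpy as np

def timedeltas_match(times_fw, times_bw, rtol=0.01):
    """Conceptual sketch: two single-ended datasets can only be merged
    cleanly if both channels were logged with (nearly) the same time steps.
    Not dtscalibration's actual implementation."""
    # Convert timestamps to integer nanoseconds and take consecutive differences.
    dt_fw = np.diff(np.asarray(times_fw, dtype="datetime64[ns]").astype("int64"))
    dt_bw = np.diff(np.asarray(times_bw, dtype="datetime64[ns]").astype("int64"))
    if dt_fw.size != dt_bw.size:
        return False
    # Allow a small relative tolerance between the two logging intervals.
    return bool(np.allclose(dt_fw, dt_bw, rtol=rtol))
```

Making such a check optional lets users merge channels whose logging clocks drifted slightly, at their own risk.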

CITATION.cff

Lines changed: 1 addition & 1 deletion
@@ -22,5 +22,5 @@ doi: "10.5281/zenodo.1410097"
 license: "BSD-3-Clause"
 repository-code: "https://github.com/dtscalibration/python-dts-calibration"
 title: "Python distributed temperature sensing calibration"
-version: "v3.0.3"
+version: "v3.1.0"
 url: "https://python-dts-calibration.readthedocs.io"

CONTRIBUTING.rst

Lines changed: 0 additions & 1 deletion
@@ -53,7 +53,6 @@ To set up `python-dts-calibration` for local development:
 
     pip install -e ".[dev]"
 
-
 4. When you're done making changes, make sure the code follows the right style, that all tests pass, and that the docs build with the following commands::
 
     hatch run format

README.rst

Lines changed: 2 additions & 2 deletions
@@ -29,9 +29,9 @@ Overview
     :alt: PyPI Package latest release
     :target: https://pypi.python.org/pypi/dtscalibration
 
-.. |commits-since| image:: https://img.shields.io/github/commits-since/dtscalibration/python-dts-calibration/v3.0.3.svg
+.. |commits-since| image:: https://img.shields.io/github/commits-since/dtscalibration/python-dts-calibration/v3.1.0.svg
     :alt: Commits since latest release
-    :target: https://github.com/dtscalibration/python-dts-calibration/compare/v1.1.1...main
+    :target: https://github.com/dtscalibration/python-dts-calibration/compare/v3.1.0...main
 
 .. |wheel| image:: https://img.shields.io/pypi/wheel/dtscalibration.svg
     :alt: PyPI Wheel

docs/conf.py

Lines changed: 1 addition & 2 deletions
@@ -40,14 +40,13 @@
 spelling_show_suggestions = True
 spelling_lang = "en_US"
 
-
 source_suffix = [".rst", ".md"]
 master_doc = "index"
 project = "dtscalibration"
 year = str(date.today().year)
 author = "Bas des Tombe and Bart Schilperoort"
 copyright = f"{year}, {author}"
-version = release = "3.0.3"
+version = release = "3.1.0"
 
 pygments_style = "trac"
 templates_path = [".", sphinx_autosummary_accessors.templates_path]

pyproject.toml

Lines changed: 7 additions & 8 deletions
@@ -19,7 +19,7 @@ disable = true # Requires confirmation when publishing to pypi.
 
 [project]
 name = "dtscalibration"
-version = "3.0.3"
+version = "3.1.0"
 description = "Load Distributed Temperature Sensing (DTS) files, calibrate the temperature and estimate its uncertainty."
 readme = "README.rst"
 license = "BSD-3-Clause"
@@ -48,19 +48,17 @@ classifiers = [
     "Programming Language :: Python :: 3.9",
     "Programming Language :: Python :: 3.10",
     "Programming Language :: Python :: 3.11",
+    "Programming Language :: Python :: 3.12",
     "Topic :: Utilities",
 ]
 dependencies = [
-    "numpy>=1.22.4, <=2.0.1",  # >= 1.22 for quantile method support in xarray. https://github.com/statsmodels/statsmodels/issues/9333
-    "dask",
+    "numpy",
+    "xarray[accel]",
+    "dask[distributed]",
     "pandas",
-    "xarray[parallel]",  # numbagg (llvmlite) is a pain to install with pip
-    "bottleneck",  # speeds up Xarray
-    "flox",  # speeds up Xarray
     "pyyaml>=6.0.1",
     "xmltodict",
     "scipy",
-    "statsmodels",
     "matplotlib",
     "netCDF4>=1.6.4",
     "nc-time-axis>=1.4.1"  # plot dependency of xarray
@@ -128,10 +126,11 @@ build = [
 features = ["dev"]
 
 [[tool.hatch.envs.matrix_test.matrix]]
-python = ["3.9", "3.10", "3.11"]
+python = ["3.9", "3.10", "3.11", "3.12"]
 
 [tool.hatch.envs.matrix_test.scripts]
 test = ["pytest ./src/ ./tests/",]  # --doctest-modules
+fast-test = ["pytest ./tests/ -m \"not slow\"",]
 
 [tool.pytest.ini_options]
 testpaths = ["tests"]
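With statsmodels dropped from the required dependencies, the test suite needs a way to detect whether it is importable before running the solver-comparison tests. A minimal sketch of such an availability probe (the `has_statsmodels` helper is hypothetical, not the project's actual test code):

```python
import importlib.util

def has_statsmodels() -> bool:
    """Availability probe for the now-optional statsmodels dependency.

    A test suite can use this to skip the wls_sparse/wls_stats comparison
    tests when statsmodels is not installed, instead of failing on import.
    """
    # find_spec returns None when the package cannot be located; it does
    # not actually import the (potentially heavy) package.
    return importlib.util.find_spec("statsmodels") is not None
```

With pytest, the same effect is commonly achieved by calling `pytest.importorskip("statsmodels")` at the top of the comparison test module, which skips the whole module when the import fails.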

src/dtscalibration/calibrate_utils.py

Lines changed: 12 additions & 2 deletions
@@ -1,6 +1,5 @@
 import numpy as np
 import scipy.sparse as sp
-import statsmodels.api as sm
 import xarray as xr
 from scipy.sparse import linalg as ln
 
@@ -1673,7 +1672,6 @@ def construct_submatrices_matching_sections(x, ix_sec, hix, tix, nt, trans_att):
     Zero_d_eq12 : sparse matrix
         Zero in EQ1 and EQ2
 
-
     """
     # contains all indices of the entire fiber that either are using for
     # calibrating to reference temperature or for matching sections. Is sorted.
@@ -1988,6 +1986,9 @@ def wls_sparse(
 ):
     """Weighted least squares solver.
 
+    Note: during development this solver was compared to the statsmodels solver. To
+    enable the comparison tests again, install `statsmodels` before running pytest.
+
     If some initial estimate x0 is known and if damp == 0, one could proceed as follows:
       - Compute a residual vector r0 = b - A*x0.
       - Use LSQR to solve the system A*dx = r0.
@@ -2124,6 +2125,15 @@ def wls_stats(X, y, w=1.0, calc_cov=False, x0=None, return_werr=False, verbose=F
     p_cov : ndarray
         The covariance of the solution.
     """
+    try:
+        import statsmodels.api as sm
+    except ModuleNotFoundError as err:
+        msg = (
+            "Statsmodels has to be installed for this function.\n"
+            "Install it with `pip install statsmodels`."
+        )
+        raise ModuleNotFoundError(msg) from err
+
     y = np.asarray(y)
     w = np.asarray(w)

src/dtscalibration/calibration/section_utils.py

Lines changed: 0 additions & 4 deletions
@@ -43,7 +43,6 @@ def validate_no_overlapping_sections(sections: dict[str, list[slice]]):
     assert all_start_stop_startsort_flat == sorted(
         all_start_stop_startsort_flat
     ), "Sections contains overlapping stretches"
-    pass
 
 
 def validate_sections_definition(sections: dict[str, list[slice]]):
@@ -123,7 +122,6 @@ def validate_sections(ds: xr.Dataset, sections: dict[str, list[slice]]):
             f"Better define the {k} section. You tried {vi}, "
             "which is not within the x-dimension"
         )
-    pass
 
 
 def ufunc_per_section(
@@ -176,7 +174,6 @@ def ufunc_per_section(
     TODO: Spend time on creating a slice instead of appendng everything\
         to a list and concatenating after.
 
-
     Returns:
     --------
@@ -241,7 +238,6 @@ def ufunc_per_section(
 
     >>> ix_loc = d.ufunc_per_section(sections, x_indices=True)
 
-
     Note:
     ----
     If `self[label]` or `self[subtract_from_label]` is a Dask array, a Dask
