
Commit 4a3aa8a

Authored by davechris (David Lah) and BSchilperoort
IO AP sensing (Follow up of #219 and #218) (#220)
* Device ID bug for AP Sensing fixed. The device ID is N4386B instead of C320; C320 was an arbitrary name the user had given to the wellbore.
* More AP Sensing test data added. Advantage over the existing data: both .xml and .tra are exported, which gives more metadata, readings of the instrument's temperature sensors, and the instrument-computed log-ratio and attenuation.
* apsensing io extended to read both .xml and .tra files. Added a check whether a .tra file with an identical timestamp exists in the .xml directory; if so, it is used to import t_by_dts, log_ratio, loss, and PT100 data (if present in the file).
* apsensing .tra file parser updated based on Bart's generic parser idea (not tested at that stage).
* Cleaned up comments and docstrings in apsensing.py.
* Added a test inside the function test_read_apsensing_files in test_datastore.py; changelog updated.
* AP Sensing device name changed in README.rst.
* Corrected the linting errors reported by hatch run format in apsensing.py, sensornet.py and sensortran.py.
* Update CHANGELOG.rst (co-authored by Bart Schilperoort).
* Update src/dtscalibration/io/apsensing.py: printout language fix (co-authored by Bart Schilperoort).
* Renamed the AP Sensing data explanation and changed it to markdown format, changed the detection algorithm for .tra files, and added a double-ended calibration test of the ap_sensing_2 data to test_datastore.py.
* Bart's newly written append_to_data_vars_structure function added and tested.
* Added a flag to load the .tra array data only when specified.
* Automatic changes by hatch run format.
* Added to the docstring of io/apsensing.py: the load_tra_arrays flag and a note that the current .tra implementation is limited to in-memory reading only.
* Move CI linting to Python 3.9.
* Ignore weird mypy error.
* Fix broken notebook.
* Updated docs/notebooks/A3Load_ap_sensing_files.ipynb: added an explanation of the .xml and .tra files and an example of importing .xml + .tra data.

---------
Co-authored-by: David Lah <[email protected]>
Co-authored-by: Bart Schilperoort <[email protected]>
1 parent b51db54 commit 4a3aa8a

32 files changed (+24134 / -35 lines)
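As context for the diffs below: the new AP Sensing reader picks up .xml files from a directory and, when matching .tra files are present, their extra data as well. A minimal usage sketch, assuming the read_apsensing_files call shown in the notebook diff further down; the load_tra_arrays keyword is taken from the commit message above, and its exact signature and default are assumptions:

```python
import os

from dtscalibration import read_apsensing_files

# Directory with the AP Sensing .xml exports; .tra files with identical
# timestamps in the same directory can provide PT100 and log-ratio data.
filepath = os.path.join("tests", "data", "ap_sensing_2", "CH1_SE")

# .xml only: Stokes and anti-Stokes intensities plus metadata.
ds = read_apsensing_files(directory=filepath)

# Also load the .tra array data (flag name from the commit message; the
# released signature may differ).
ds_with_tra = read_apsensing_files(directory=filepath, load_tra_arrays=True)

print(ds_with_tra.data_vars)
```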

.github/workflows/build.yml

Lines changed: 2 additions & 2 deletions
@@ -16,10 +16,10 @@ jobs:
       fail-fast: false
     steps:
       - uses: actions/checkout@v3
-      - name: Set up Python 3.10
+      - name: Set up Python 3.9
         uses: actions/setup-python@v3
         with:
-          python-version: "3.10"
+          python-version: "3.9"
       - name: Python info
         shell: bash -l {0}
         run: |

CHANGELOG.rst

Lines changed: 12 additions & 0 deletions
@@ -1,6 +1,18 @@
 
 Changelog
 =========
+3.0.4 (2024-08-30)
+---
+
+Fixed
+
+* Device ID bug for AP Sensing. The device ID is N4386B instead of C320; C320 was an arbitrary name given by the user for the wellbore.
+
+Added
+
+* More test data from the AP Sensing device N4386B, which also contain its .tra log files.
+* AP Sensing .tra support, as the reference temperature sensor data of this device is only logged in the .tra files and not in the .xml log files.
+  Added functions in io/apsensing.py to read .tra files if they are in the same directory as the .xml files.
 
 3.0.3 (2024-04-18)
 ---
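The .tra support added above relies on a simple pairing rule: a .tra file is only used when a file with the identical timestamp sits in the same directory as the .xml file. A hedged illustration of that rule, not the library's actual implementation; the helper name and the filename pattern (taken from the test data shown further down) are assumptions:

```python
import glob
import os
import re


def _timestamps(paths: list[str]) -> set[str]:
    """Extract the trailing YYYYMMDDhhmmss timestamp from AP Sensing file names."""
    return {
        m.group(1)
        for p in paths
        if (m := re.search(r"_(\d{14})\.\w+$", os.path.basename(p)))
    }


directory = os.path.join("tests", "data", "ap_sensing_2", "CH1_SE")
xml_stamps = _timestamps(glob.glob(os.path.join(directory, "*.xml")))
tra_stamps = _timestamps(glob.glob(os.path.join(directory, "*.tra")))

missing = xml_stamps - tra_stamps
if missing:
    # Mirrors the UserWarning visible in the notebook output below: without a
    # complete set of matching .tra files, no .tra data is loaded.
    print(f"Not all .xml files have a matching .tra file. Missing: {missing}")
else:
    print("Every .xml file has a matching .tra file; .tra data can be loaded.")
```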

README.rst

Lines changed: 1 addition & 1 deletion
@@ -88,7 +88,7 @@ Devices currently supported
 ===========================
 * Silixa Ltd.: **Ultima** & **XT-DTS** .xml files *(up to version 8.1)*
 * Sensornet Ltd.: **Oryx**, **Halo** & **Sentinel** .ddf files
-* AP Sensing: **CP320** .xml files *(single ended only)*
+* AP Sensing: **N4386B** .xml files *(single ended only)*
 * SensorTran: **SensorTran 5100** .dat binary files *(single ended only)*
 
 Documentation

docs/notebooks/04Calculate_variance_Stokes.ipynb

Lines changed: 2 additions & 2 deletions
@@ -185,8 +185,8 @@
     "import scipy\n",
     "import numpy as np\n",
     "\n",
-    "sigma = residuals.std()\n",
-    "mean = residuals.mean()\n",
+    "sigma = residuals.std().to_numpy()\n",
+    "mean = residuals.mean().to_numpy()\n",
     "x = np.linspace(mean - 3 * sigma, mean + 3 * sigma, 100)\n",
     "approximated_normal_fit = scipy.stats.norm.pdf(x, mean, sigma)\n",
     "residuals.plot.hist(bins=50, figsize=(12, 8), density=True)\n",

docs/notebooks/A3Load_ap_sensing_files.ipynb

Lines changed: 151 additions & 13 deletions
@@ -5,12 +5,20 @@
    "metadata": {},
    "source": [
     "# A3. Loading AP Sensing files\n",
-    "This example loads AP sensing files. Only single-ended files are currently supported. Just like with Silixa's devices, the AP Sensing data is in .xml files"
+    "This example loads AP Sensing files. Only single-ended files are currently supported.\n",
+    "\n",
+    "The currently supported AP Sensing N4386B device has two data logging options: logging into .xml files and into .tra files. Only the .xml files contain the Stokes and anti-Stokes intensities needed for the calibration. Unfortunately, these .xml files are scarce on metadata and do not contain the readings of additionally connected sensors (e.g. PT100) of the device. The latter are contained inside the .tra file.\n",
+    "\n",
+    "If you did not connect any additional sensors, you can use the .xml files only and add your own logged temperature data to the datastore for calibration. (Hint: the .xml file export is well hidden in the AP Sensing software *DTS Configurator* and not documented in the user manual. Inside your *Configuration*, turn *POSC export* on - this will export the .xml files.)\n",
+    "\n",
+    "If you additionally want to use data exported to .tra files (e.g. PT100 data), enable *Auto Save Traces* under *Program Options* and make sure *Create Multitrace files* and *Use Binary Format* are both disabled. Place the .tra files in the same directory as the .xml files; they will then be imported automatically by the *read_apsensing_files* command.\n",
+    "\n",
+    "The current implementation of .tra file parsing is limited to in-memory reading only."
    ]
   },
   {
    "cell_type": "code",
-   "execution_count": null,
+   "execution_count": 12,
    "metadata": {
     "execution": {
      "iopub.execute_input": "2022-04-06T08:12:29.520519Z",
@@ -36,7 +44,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": null,
+   "execution_count": 13,
    "metadata": {
     "execution": {
      "iopub.execute_input": "2022-04-06T08:12:31.219744Z",
@@ -45,15 +53,24 @@
      "shell.execute_reply": "2022-04-06T08:12:31.224123Z"
     }
    },
-   "outputs": [],
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "..\\..\\tests\\data\\ap_sensing\n"
+     ]
+    }
+   ],
    "source": [
     "filepath = os.path.join(\"..\", \"..\", \"tests\", \"data\", \"ap_sensing\")\n",
+    "filepath_with_tra = os.path.join(\"..\", \"..\", \"tests\", \"data\", \"ap_sensing_2\", \"CH1_SE\")\n",
     "print(filepath)"
    ]
   },
   {
    "cell_type": "code",
-   "execution_count": null,
+   "execution_count": 14,
    "metadata": {
     "execution": {
      "iopub.execute_input": "2022-04-06T08:12:31.254656Z",
@@ -62,7 +79,17 @@
      "shell.execute_reply": "2022-04-06T08:12:31.258995Z"
     }
    },
-   "outputs": [],
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "_AP Sensing_N4386B_3_20180118201727.xml\n",
+      "_AP Sensing_N4386B_3_20180118202957.xml\n",
+      "_AP Sensing_N4386B_3_20180118205357.xml\n"
+     ]
+    }
+   ],
    "source": [
     "filepathlist = sorted(glob.glob(os.path.join(filepath, \"*.xml\")))\n",
     "filenamelist = [os.path.basename(path) for path in filepathlist]\n",
@@ -80,7 +107,7 @@
   },
   {
    "cell_type": "code",
-   "execution_count": null,
+   "execution_count": 15,
    "metadata": {
     "execution": {
      "iopub.execute_input": "2022-04-06T08:12:31.262782Z",
@@ -89,23 +116,51 @@
      "shell.execute_reply": "2022-04-06T08:12:31.692317Z"
     }
    },
-   "outputs": [],
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "3 files were found, each representing a single timestep\n",
+      "4 recorded vars were found: LAF, TEMP, ST, AST\n",
+      "Recorded at 7101 points along the cable\n",
+      "The measurement is single ended\n",
+      "Reading the data from disk\n",
+      "3 files were found, each representing a single timestep\n",
+      "4 recorded vars were found: LAF, TEMP, ST, AST\n",
+      "Recorded at 1201 points along the cable\n",
+      "The measurement is single ended\n",
+      "Reading the data from disk\n",
+      ".tra files exist and will be read\n"
+     ]
+    },
+    {
+     "name": "stderr",
+     "output_type": "stream",
+     "text": [
+      "C:\\Users\\David Lah\\Documents\\dts-data-processing\\extern\\python-dts-calibration\\src\\dtscalibration\\io\\apsensing.py:480: UserWarning: Not all .xml files have a matching .tra file.\n",
+      " Missing are time following timestamps {'20180118202957', '20180118201727', '20180118205357'}. Not loading .tra data.\n",
+      " warnings.warn(msg)\n"
+     ]
+    }
+   ],
    "source": [
-    "ds = read_apsensing_files(directory=filepath)"
+    "ds = read_apsensing_files(directory=filepath)\n",
+    "ds_with_tra = read_apsensing_files(directory=filepath_with_tra)"
    ]
   },
   {
    "cell_type": "markdown",
    "metadata": {},
    "source": [
     "The object tries to gather as much metadata from the measurement files as possible (temporal and spatial coordinates, filenames, temperature probes measurements). All other configuration settings are loaded from the first files and stored as attributes of the `xarray.Dataset`.\n",
-    "\n",
+    "y\n",
     "Calibration follows as usual (see the other notebooks)."
    ]
   },
   {
    "cell_type": "code",
-   "execution_count": null,
+   "execution_count": 16,
    "metadata": {
     "execution": {
      "iopub.execute_input": "2022-04-06T08:12:31.695872Z",
@@ -114,10 +169,93 @@
      "shell.execute_reply": "2022-04-06T08:12:31.705163Z"
     }
    },
-   "outputs": [],
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "<xarray.Dataset> Size: 569kB\n",
+      "Dimensions: (x: 7101, time: 3)\n",
+      "Coordinates:\n",
+      " * x (x) float64 57kB 0.0 0.5 1.0 ... 3.549e+03 3.55e+03 3.55e+03\n",
+      " filename (time) <U39 468B '_AP Sensing_N4386B_3_20180118201727.xml' ...\n",
+      " * time (time) datetime64[ns] 24B 2018-01-18T20:17:27 ... 2018-01-1...\n",
+      "Data variables:\n",
+      " tmp (x, time) float64 170kB 12.16 11.32 12.26 ... 15.08 17.83\n",
+      " st (x, time) float64 170kB 1.098 1.105 ... 3.39e-18 3.409e-18\n",
+      " ast (x, time) float64 170kB 0.1888 0.1891 ... 4.838e-19 4.945e-19\n",
+      " creationDate (time) datetime64[ns] 24B 2018-01-18T20:17:27 ... 2018-01-1...\n",
+      "Attributes: (12/51)\n",
+      " wellbore:uid: ...\n",
+      " wellbore:name: ...\n",
+      " wellbore:dtsInstalledSystemSet:dtsInstalledSystem:uid: ...\n",
+      " wellbore:dtsInstalledSystemSet:dtsInstalledSystem:name: ...\n",
+      " wellbore:dtsInstalledSystemSet:dtsInstalledSystem:fiberInformation:fiber:...\n",
+      " wellbore:dtsInstalledSystemSet:dtsInstalledSystem:fiberInformation:fiber:...\n",
+      " ... ...\n",
+      " wellbore:wellLogSet:wellLog:blockInfo:blockCurveInfo_2:columnIndex: ...\n",
+      " wellbore:wellLogSet:wellLog:blockInfo:blockCurveInfo_3:curveId: ...\n",
+      " wellbore:wellLogSet:wellLog:blockInfo:blockCurveInfo_3:columnIndex: ...\n",
+      " isDoubleEnded: ...\n",
+      " forwardMeasurementChannel: ...\n",
+      " backwardMeasurementChannel: ...\n"
+     ]
+    }
+   ],
    "source": [
     "print(ds)"
    ]
+  },
+  {
+   "cell_type": "code",
+   "execution_count": 17,
+   "metadata": {},
+   "outputs": [
+    {
+     "name": "stdout",
+     "output_type": "stream",
+     "text": [
+      "<xarray.DataArray 'probe1Temperature' (time: 3)> Size: 24B\n",
+      "array([19.60636, 19.62306, 19.62306])\n",
+      "Coordinates:\n",
+      " filename (time) <U45 540B 'CH1_SE_AP Sensing_N4386B_1_20240130141820.xml...\n",
+      " * time (time) datetime64[ns] 24B 2024-01-30T14:18:20 ... 2024-01-30T14...\n",
+      "<xarray.Dataset> Size: 97kB\n",
+      "Dimensions: (x: 1201, time: 3)\n",
+      "Coordinates:\n",
+      " * x (x) float64 10kB -50.0 -49.75 -49.5 ... 249.5 249.8 250.0\n",
+      " filename (time) <U45 540B 'CH1_SE_AP Sensing_N4386B_1_202401301...\n",
+      " * time (time) datetime64[ns] 24B 2024-01-30T14:18:20 ... 2024...\n",
+      "Data variables:\n",
+      " tmp (x, time) float64 29kB 22.49 22.85 23.14 ... 20.3 19.71\n",
+      " st (x, time) float64 29kB 1.254 1.256 ... 0.8482 0.8397\n",
+      " ast (x, time) float64 29kB 0.2453 0.2461 ... 0.163 0.1609\n",
+      " creationDate (time) datetime64[ns] 24B 2024-01-30T14:18:20 ... 2024...\n",
+      " probe1Temperature (time) float64 24B 19.61 19.62 19.62\n",
+      " probe2Temperature (time) float64 24B 50.18 50.17 50.18\n",
+      " probe3Temperature (time) float64 24B 18.57 18.6 18.56\n",
+      " probe4Temperature (time) float64 24B 18.53 18.55 18.56\n",
+      "Attributes: (12/51)\n",
+      " wellbore:uid: ...\n",
+      " wellbore:name: ...\n",
+      " wellbore:dtsInstalledSystemSet:dtsInstalledSystem:uid: ...\n",
+      " wellbore:dtsInstalledSystemSet:dtsInstalledSystem:name: ...\n",
+      " wellbore:dtsInstalledSystemSet:dtsInstalledSystem:fiberInformation:fiber:...\n",
+      " wellbore:dtsInstalledSystemSet:dtsInstalledSystem:fiberInformation:fiber:...\n",
+      " ... ...\n",
+      " wellbore:wellLogSet:wellLog:blockInfo:blockCurveInfo_2:columnIndex: ...\n",
+      " wellbore:wellLogSet:wellLog:blockInfo:blockCurveInfo_3:curveId: ...\n",
+      " wellbore:wellLogSet:wellLog:blockInfo:blockCurveInfo_3:columnIndex: ...\n",
+      " isDoubleEnded: ...\n",
+      " forwardMeasurementChannel: ...\n",
+      " backwardMeasurementChannel: ...\n"
+     ]
+    }
+   ],
+   "source": [
+    "print(ds_with_tra.probe1Temperature)\n",
+    "print(ds_with_tra)"
+   ]
   }
  ],
  "metadata": {
@@ -136,7 +274,7 @@
    "name": "python",
    "nbconvert_exporter": "python",
    "pygments_lexer": "ipython3",
-   "version": "3.9.11"
+   "version": "3.10.4"
   }
  },
 "nbformat": 4,

src/dtscalibration/dts_accessor_utils.py

Lines changed: 7 additions & 5 deletions
@@ -717,7 +717,9 @@ def merge_double_ended(
         ds_fw.attrs["isDoubleEnded"] == "0" and ds_bw.attrs["isDoubleEnded"] == "0"
     ), "(one of the) input DataStores is already double ended"
 
-    ds_fw, ds_bw = merge_double_ended_times(ds_fw, ds_bw, verbose=verbose, verify_timedeltas=verify_timedeltas)
+    ds_fw, ds_bw = merge_double_ended_times(
+        ds_fw, ds_bw, verbose=verbose, verify_timedeltas=verify_timedeltas
+    )
 
     ds = ds_fw.copy()
     ds_bw = ds_bw.copy()
@@ -741,10 +743,10 @@
     ds.attrs["isDoubleEnded"] = "1"
     ds["userAcquisitionTimeBW"] = ("time", ds_bw["userAcquisitionTimeFW"].values)
 
-    if plot_result:
-        _, ax = plt.subplots()
-        ds["st"].isel(time=0).plot(ax=ax, label="Stokes forward")
-        ds["rst"].isel(time=0).plot(ax=ax, label="Stokes backward")
+    if plot_result:  # type: ignore
+        _, ax = plt.subplots()  # type: ignore
+        ds["st"].isel(time=0).plot(ax=ax, label="Stokes forward")  # type: ignore
+        ds["rst"].isel(time=0).plot(ax=ax, label="Stokes backward")  # type: ignore
     ax.legend()
 
     return ds
