impdar - Python package, version comparison

Comparing version 1.1.6 to 1.1.7
impdar.egg-info/PKG-INFO
Metadata-Version: 2.1
Name: impdar
Version: 1.1.6
Version: 1.1.7
Summary: Scripts for impulse radar
Home-page: http://github.com/dlilien/impdar
Author: David Lilien
Author-email: dal22@uw.edu
Author-email: dlilien@iu.edu
License: GNU GPL-3.0

@@ -15,3 +15,3 @@ Description: # ImpDAR: an impulse radar processor

ImpDAR is intended to be more flexible than other available options. Support is gradually being added for a variety of file formats. Currently, [GSSI](http://www.geophysical.com), [PulseEKKO](http://www.sensoft.ca), [Ramac](http://www.malagpr.com), [Blue Systems](http://www.bluesystem.ca/ice-penetrating-radar.html), DELORES, SEGY, [gprMAX](http://www.gprmax.com), Gecko, and legacy StoDeep files are supported. Available processing steps include various filtering operations, trivial modifications such as restacking, cropping, or reversing data, and a few different geolocation-related operations like interpolating to constant trace spacing. The integrated migration routines are in development but Stolt is working.
ImpDAR is intended to be more flexible than other available options. Support is gradually being added for a variety of file formats. Currently, [GSSI](http://www.geophysical.com), [PulseEKKO](http://www.sensoft.ca), [Ramac](http://www.malagpr.com), [Blue Systems](http://www.bluesystem.ca/ice-penetrating-radar.html), DELORES, SEGY, [gprMAX](http://www.gprmax.com), Gecko, and legacy StoDeep files are supported. ImpDAR can also read in MCoRDS files, though these are already processed so this would just be for tracing. Available processing steps include various filtering operations, trivial modifications such as restacking, cropping, or reversing data, and a few different geolocation-related operations like interpolating to constant trace spacing.

@@ -33,3 +33,3 @@ The primary interface to ImpDAR is through the command line, which allows efficient processing of large volumes of data. An API, centered around the RadarData class, is also available to allow the user to use ImpDAR in other programs.

#### Required
*Python 3* The package is tested on Python 3.7+. Older versions may work, but we have stopped testing on 2.7 since it has reached end of life. You can probably get 2.7 to work still, but no guarantees.
[Python 3](http://python.org) The package is tested on Python 3.7 to 3.10. It is probably best to upgrade to one of those versions; 3.6 is likely to work, though untested, while older versions are unlikely to work. 3.11 should be fine on the ImpDAR side, though you may have issues finding prebuilt binaries for some dependencies.

@@ -36,0 +36,0 @@ You also need:

@@ -164,3 +164,3 @@ #! /usr/bin/env python

type=str,
choices=['shp', 'mat', 'segy'])
choices=convert.OUTPUT_FILETYPES)
parser_convert.add_argument('-in_fmt',

@@ -167,0 +167,0 @@ type=str,

@@ -288,2 +288,7 @@ #! /usr/bin/env python

(number of traces)')
parser_denoise.add_argument('--filt',
type=str,
choices=['weiner', 'median'],
default='weiner',
help='Filter type (Weiner or median with specified dimensions)')
_add_def_args(parser_denoise)

@@ -499,5 +504,5 @@

def denoise(dat, vert_win=1, hor_win=10, noise=None, filter_type='wiener', **kwargs):
def denoise(dat, vert_win=1, hor_win=10, noise=None, filt='wiener', **kwargs):
"""Despeckle."""
dat.denoise(vert_win=vert_win, hor_win=hor_win, noise=noise, ftype=filter_type)
dat.denoise(vert_win=vert_win, hor_win=hor_win, noise=noise, ftype=filt)
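The rename from `filter_type` to `filt` aligns the function keyword with the CLI's `--filt` flag, so parsed arguments can be forwarded to the processing function without translation. A minimal sketch of that pattern (this toy parser is illustrative, not ImpDAR's actual CLI setup):

```python
import argparse

# Hypothetical mini-parser mirroring the --filt flag: because the argparse
# dest ('filt') now matches the function keyword, something like
# denoise(dat, **vars(args)) works without renaming keys first.
parser = argparse.ArgumentParser()
parser.add_argument('--filt', type=str, choices=['wiener', 'median'],
                    default='wiener')

args = parser.parse_args(['--filt', 'median'])
print(args.filt)  # -> median
```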

@@ -504,0 +509,0 @@

@@ -266,2 +266,3 @@ #! /usr/bin/env python

center.set_pickradius(5)
center.set_picker(5)
bottom.set_pickradius(5)

@@ -470,3 +471,2 @@ top.set_pickradius(5)

def _auto_click(self, event, point_color='m'):

@@ -473,0 +473,0 @@ """Click with auto on.

@@ -15,2 +15,5 @@ #! /usr/bin/env python

OUTPUT_FILETYPES = ['shp', 'gpkg', 'mat', 'sgy']
def convert(fns_in, out_fmt, t_srs=None, in_fmt=None, *args, **kwargs):

@@ -24,4 +27,4 @@ """Convert between formats. Mainly used to create shps and sgy files."""

if out_fmt not in ['shp', 'mat', 'sgy']:
raise ValueError('Can only convert to shp, mat, or sgy')
if out_fmt not in OUTPUT_FILETYPES:
raise ValueError('Can only convert to ' + ', '.join(OUTPUT_FILETYPES[:-1]) + ', or ' + OUTPUT_FILETYPES[-1])
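With the allowed outputs centralized in `OUTPUT_FILETYPES`, the error message is built by joining the list rather than hard-coding it, so adding a format later updates the message for free. A small sketch of that message construction (the helper name is made up for illustration):

```python
OUTPUT_FILETYPES = ['shp', 'gpkg', 'mat', 'sgy']

def format_choices(choices):
    # Render ['a', 'b', 'c'] as 'a, b, or c', matching the ValueError text above.
    return ', '.join(choices[:-1]) + ', or ' + choices[-1]

print('Can only convert to ' + format_choices(OUTPUT_FILETYPES))
# -> Can only convert to shp, gpkg, mat, or sgy
```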

@@ -64,3 +67,6 @@ # Treat this like batch input always

fn_out = os.path.splitext(data.fn)[0] + '.shp'
data.output_shp(fn_out, t_srs=t_srs)
data.output_ogr(fn_out, t_srs=t_srs, driver='ESRI Shapefile')
elif out_fmt == 'gpkg':
fn_out = os.path.splitext(data.fn)[0] + '.gpkg'
data.output_ogr(fn_out, t_srs=t_srs, driver='GPKG')
elif out_fmt == 'sgy':

@@ -67,0 +73,0 @@ if not load_segy.SEGY:

@@ -66,5 +66,5 @@ #! /usr/bin/env python

h5_data.dist = np.arange(h5_data.tnum)
h5_data.dist = np.arange(h5_data.tnum) / 1.0e3
h5_data.chan = -99.
h5_data.check_attrs()
return h5_data

@@ -51,2 +51,3 @@ #! /usr/bin/env python

mcords_data.lat = dst.variables['lat'][:]
mcords_data.elev = dst.variables['altitude'][:] - dst.variables['Surface'][:] * 3.0e8 / 2.0

@@ -93,3 +94,3 @@ # time has units of seconds according to documentation, but this seems wrong

except:
mat = h5py.File('AR_20140424_03_018.mat', 'r')
mat = h5py.File(fn_mat, 'r')

@@ -96,0 +97,0 @@ if ('Data' not in mat) or ('Longitude' not in mat):

@@ -182,7 +182,12 @@ #! /usr/bin/env python

with open(hdname, openmode_unicode) as fin:
if fin.read().find('1.5.340') != -1:
pe_data.version = '1.5.340'
fin_str = fin.read()
if fin_str.find('pulseEKKO') == -1:
pe_data.version = '1.0'
else:
pe_data.version = '1.0'
idx1 = fin_str.find('pulseEKKO')
idx2 = fin_str[idx1:].find('\n')
pe_data.version = fin_str[idx1+10:idx1+idx2]
fin.seek(0)
print(pe_data.version)
for i, line in enumerate(fin):
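Instead of matching only the literal version string '1.5.340', the new header parsing slices whatever follows the 'pulseEKKO' marker up to the end of that line, falling back to '1.0' when the marker is absent. A standalone sketch of the same slicing (the function name is hypothetical, and the `idx1 + 10` offset assumes a single separator character after the 9-character marker, as in the code above):

```python
def parse_pe_version(header_text):
    # Take the token after 'pulseEKKO ' up to the end of that line,
    # or fall back to '1.0' if the marker is missing.
    idx1 = header_text.find('pulseEKKO')
    if idx1 == -1:
        return '1.0'
    idx2 = header_text[idx1:].find('\n')
    return header_text[idx1 + 10:idx1 + idx2]

print(parse_pe_version('blah\npulseEKKO 1.5.340\nmore'))  # -> 1.5.340
```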

@@ -203,4 +208,5 @@ if 'TRACES' in line or 'NUMBER OF TRACES' in line:

doy = (int(line[:4]), int(line[5:7]), int(line[8:10]))
if i == 2 and pe_data.version == '1.5.340':
if i == 2 and pe_data.version != '1.0':
doy = (int(line[6:10]), int(line[:2]), int(line[3:5]))
day_offset = datetime.datetime(doy[0], doy[1], doy[2], 0, 0, 0)

@@ -210,3 +216,3 @@

pe_data.data = np.zeros((pe_data.snum, pe_data.tnum), dtype=np.int16)
elif pe_data.version == '1.5.340':
else:
pe_data.data = np.zeros((pe_data.snum, pe_data.tnum), dtype=np.float32)

@@ -226,3 +232,3 @@

offset += pe_data.snum * 2
elif pe_data.version == '1.5.340':
else:
fmt = '<%df' % (len(lines[offset: offset + pe_data.snum * 4]) // 4)
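The newer pulseEKKO files store traces as little-endian float32 rather than int16, hence the `'<%df'` format (4 bytes per sample instead of 2). A self-contained sketch of unpacking such a trace from raw bytes:

```python
import struct
import numpy as np

snum = 4
raw = np.arange(snum, dtype='<f4').tobytes()  # fake little-endian float32 trace

# Same construction as above: '<' for little-endian, then snum 'f' floats.
fmt = '<%df' % (len(raw) // 4)
trace = struct.unpack(fmt, raw)
print(trace)  # -> (0.0, 1.0, 2.0, 3.0)
```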

@@ -229,0 +235,0 @@ trace = struct.unpack(fmt, lines[offset:offset + pe_data.snum * 4])

@@ -34,3 +34,3 @@ #! /usr/bin/env python

segy_data.fn = fn_sgy
f = segyio.open(fn_sgy, ignore_geometry=True)
f = segyio.open(fn_sgy, ignore_geometry=True, endian='little')

@@ -37,0 +37,0 @@ segy_data.data = segyio.tools.collect(f.trace).transpose()

@@ -128,6 +128,6 @@ #! /usr/bin/env python

# complex data
if (len(UoA_data.data.dtype) == 2) or (UoA_data.data.dtype in [np.complex64, np.complex128]):
UoA_data.data = 10.0 * np.log10(np.sqrt(np.real(UoA_data.data) ** 2.0 + np.imag(UoA_data.data) ** 2.0))
else:
UoA_data.data = 10.0 * np.log10(UoA_data.data)
# if (len(UoA_data.data.dtype) == 2) or (UoA_data.data.dtype in [np.complex64, np.complex128]):
# UoA_data.data = 10.0 * np.log10(np.sqrt(np.real(UoA_data.data) ** 2.0 + np.imag(UoA_data.data) ** 2.0))
# else:
# UoA_data.data = 10.0 * np.log10(UoA_data.data)
UoA_data.snum, UoA_data.tnum = int(UoA_data.data.shape[0]), int(UoA_data.data.shape[1])

@@ -146,3 +146,15 @@ UoA_data.trace_num = np.arange(UoA_data.tnum) + 1

nminfo.elev = np.zeros_like(nminfo.lat)
print(nminfo.lat.shape)
print(nminfo.lon.shape)
print(nminfo.elev.shape)
if nminfo.lat.shape[0] > UoA_data.tnum:
nminfo.lat = nminfo.lat[:UoA_data.tnum]
if nminfo.lon.shape[0] > UoA_data.tnum:
nminfo.lon = nminfo.lon[:UoA_data.tnum]
if nminfo.elev.shape[0] > UoA_data.tnum:
nminfo.elev = nminfo.elev[:UoA_data.tnum]
len_min = np.min([nminfo.ppstime.shape[0], nminfo.lat.shape[0], nminfo.lon.shape[0]])
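The added guards clip the positioning arrays to the trace count before interpolation, so a GPS log slightly longer than the radar record no longer breaks the load. The same clipping as a small helper (the name is hypothetical):

```python
import numpy as np

def clip_to_tnum(tnum, *arrays):
    # Truncate each 1-D positioning array (lat, lon, elev, ...) to at most
    # tnum entries; arrays that are already short enough pass through.
    return [a[:tnum] if a.shape[0] > tnum else a for a in arrays]

lat, lon = np.zeros(12), np.zeros(10)
lat, lon = clip_to_tnum(10, lat, lon)
print(lat.shape, lon.shape)  # -> (10,) (10,)
```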

@@ -156,4 +168,4 @@ UoA_data.lat = interp1d(nminfo.ppstime[:len_min], nminfo.lat[:len_min], fill_value='extrapolate')(dt[:len_min])

if 'x' in grp:
UoA_data.x_coord = grp['x'][()]
UoA_data.y_coord = grp['y'][()]
UoA_data.x_coord = grp['x'][()][:UoA_data.tnum]
UoA_data.y_coord = grp['y'][()][:UoA_data.tnum]
else:

@@ -171,3 +183,7 @@ try:

UoA_data.trace_int = UoA_data.decday[1] - UoA_data.decday[0]
try:
UoA_data.trace_int = UoA_data.decday[1] - UoA_data.decday[0]
except:
UoA_data.trace_int = 1.0
UoA_data.pressure = np.zeros_like(UoA_data.decday)

@@ -174,0 +190,0 @@ UoA_data.trig = np.zeros_like(UoA_data.decday).astype(int)

@@ -22,3 +22,3 @@ /*

double integral;
double rs;
double r;
int Didx;

@@ -34,2 +34,5 @@ int lmin_prev, rmin_prev;

* based on c row-major ordering. Could be wrong though... */
/* Loop through every point in the image (trace and sample)
* to look for a hyperbola propagating away from that point */
for(j=0;j<tnum;j++){

@@ -48,19 +51,20 @@ if (j % 100 == 0){

/* Now look for the hyperbola propagating from this point */
/* Do left and right separately so we know when to break
* the break lets us skip a lot of iterations for long profiles */
for(k=j + 1;k<tnum;k++){
/* rs is the distance to the closest point in this trace??? */
rs = sqrt(pow(dist[k] - dist[j], 2.) + zs2[i]);
/* r is the distance from the surface to
* this point along the hyperbola */
r = sqrt(pow(dist[k] - dist[j], 2.) + zs2[i]);
/* We do not get closer as we go, so we can break the loop
* if we are far away */
if(2. * rs / vel > max_travel_time){
if(2. * r / vel > max_travel_time){
break;
}
costheta = zs[i] / rs;
min = 1.0e6;
min = max_travel_time;
/* The new min should not be smaller than the previous rmin
* since otherwise the hyperbola would be inverted (smiles) */
for(l=rmin_prev;l<snum;l++){
m = fabs(tt_sec[l] - 2. * rs / vel);
m = fabs(tt_sec[l] - 2. * r / vel);
if(m < min){

@@ -72,2 +76,3 @@ min = m;

rmin_prev = Didx;
costheta = zs[i] / r;
integral += gradD[Didx * tnum + k] * costheta / vel;

@@ -77,13 +82,9 @@ }

for(k=j;k>=0;k--){
rs = sqrt(pow(dist[k] - dist[j], 2.) + zs2[i]);
if(2. * rs / vel > max_travel_time){
r = sqrt(pow(dist[k] - dist[j], 2.) + zs2[i]);
if(2. * r / vel > max_travel_time){
break;
}
costheta = zs[i] / rs;
if (rs <= 1.0e8){
costheta = 0.;
}
min = 1.0e6;
min = max_travel_time;
for(l=lmin_prev;l<snum;l++){
m = fabs(tt_sec[l] - 2. * rs / vel);
m = fabs(tt_sec[l] - 2. * r / vel);
if(m < min){

@@ -95,6 +96,7 @@ min = m;

lmin_prev = Didx;
costheta = zs[i] / r;
integral += gradD[Didx * tnum + k] * costheta / vel;
}
migdata[i * tnum + j] = integral;
migdata[i * tnum + j] = integral / (2*M_PI);
}
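The C changes rename `rs` to `r`, compute the obliquity factor `costheta` from the matched sample, and normalize the summed integral by 2π. The inner early-break logic, which skips traces once the hyperbola's two-way travel time exceeds the record length, can be sketched in Python (function name and toy values are illustrative only):

```python
import numpy as np

def traces_in_aperture(dist, j, depth, vel, max_travel_time):
    # Walk right from trace j; stop once the two-way time 2*r/vel for the
    # hyperbola distance r exceeds the record length, as in the C loop above.
    count = 0
    for k in range(j + 1, len(dist)):
        r = np.sqrt((dist[k] - dist[j]) ** 2 + depth ** 2)
        if 2.0 * r / vel > max_travel_time:
            break
        count += 1
    return count

print(traces_in_aperture(np.arange(10.0), 0, 0.0, 2.0, 3.0))  # -> 3
```

Because trace-to-trace distance only grows, the `break` (rather than `continue`) is safe and saves many iterations on long profiles.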

@@ -101,0 +103,0 @@ if (j % 100 == 0){

@@ -36,3 +36,4 @@ #! /usr/bin/env python

def migrationKirchhoffLoop(data, migdata, tnum, snum, dist, zs, zs2, tt_sec, vel, gradD, max_travel_time, nearfield):
# Loop through all traces
# Loop through every point in the image (trace and sample)
# to look for a hyperbola propagating away from that point
print('Migrating trace number:')

@@ -42,9 +43,5 @@ for xi in range(tnum):

sys.stdout.flush()
# get the trace distance
x = dist[xi]
dists2 = (dist - x)**2.
# Loop through all samples
for ti in range(snum):
# get the radial distances between input point and output point
rs = np.sqrt(dists2 + zs2[ti])
# get the radial distances between points and the surface location for this trace
rs = np.sqrt((dist - dist[xi])**2. + zs2[ti])
# find the cosine of the angle of the tangent line, correct for obliquity factor

@@ -109,3 +106,15 @@ with np.errstate(invalid='ignore'):

migrationKirchhoffLoop(dat.data.astype(np.float64), migdata, dat.tnum, dat.snum, dat.dist.astype(np.float64), zs.astype(np.float64), zs2.astype(np.float64), tt_sec.astype(np.float64), vel, gradD.astype(np.float64), max_travel_time, nearfield)
migrationKirchhoffLoop(dat.data.astype(np.float64),
migdata,
dat.tnum,
dat.snum,
np.ascontiguousarray(dat.dist, dtype=np.float64) * 1.0e3,
np.ascontiguousarray(zs, dtype=np.float64),
np.ascontiguousarray(zs2, dtype=np.float64),
np.ascontiguousarray(tt_sec, dtype=np.float64),
vel,
np.ascontiguousarray(gradD, dtype=np.float64),
max_travel_time,
nearfield
)
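Passing strided or non-contiguous NumPy arrays into a C extension that assumes row-major layout reads the wrong memory, which is presumably why each argument is now wrapped in `np.ascontiguousarray` (and why `dist` gets the `* 1.0e3` km-to-m conversion, matching the loader change that stores distance in km). A quick illustration of the contiguity issue:

```python
import numpy as np

a = np.arange(12, dtype=np.float64).reshape(3, 4)
view = a[:, ::2]  # every other column: a strided view, not C-contiguous

# ascontiguousarray copies only when needed and guarantees dtype + layout,
# which is what a C loop indexing data[i * tnum + j] requires.
safe = np.ascontiguousarray(view, dtype=np.float64)
print(view.flags['C_CONTIGUOUS'], safe.flags['C_CONTIGUOUS'])  # -> False True
```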

@@ -151,3 +160,3 @@ dat.data = migdata.copy()

H,V = np.meshgrid(h,v)
dat.data *= H*V
dat.data = (dat.data*H*V).astype(dat.data.dtype)
# 2D Forward Fourier Transform to get data in frequency-wavenumber space, FK = D(kx,z=0,ws)

@@ -154,0 +163,0 @@ FK = np.fft.fft2(dat.data,(dat.snum,dat.tnum))[:dat.snum//2]

@@ -180,3 +180,3 @@ #! /usr/bin/env python

if dat.data.dtype in [np.complex128]:
if dat.data.dtype in [np.complex128, np.complex64, np.complex, complex]:
def norm(x):

@@ -183,0 +183,0 @@ return 10.0 * np.log10(np.absolute(x))
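The widened dtype check means complex-valued data of any precision gets displayed as dB magnitude (note that the bare `np.complex` alias in the new check was deprecated in NumPy 1.20 and removed in 1.24, so on newer NumPy the `np.complex64`/`np.complex128` entries plus the builtin `complex` do the work). The conversion itself reduces to one expression; a sketch with a made-up helper name:

```python
import numpy as np

def power_db(x):
    # dB magnitude that works for real or complex samples alike:
    # np.absolute gives |x|, so 10*log10(|x|) matches the norm() above.
    return 10.0 * np.log10(np.absolute(x))

print(power_db(100.0))  # -> 20.0
print(power_db(3 + 4j))  # |3+4j| = 5, so 10*log10(5)
```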

@@ -117,3 +117,3 @@ #! /usr/bin/env python

from ._RadarDataSaving import save, save_as_segy, output_shp, output_csv, \
_get_pick_targ_info
_get_pick_targ_info, output_ogr
from ._RadarDataFiltering import adaptivehfilt, horizontalfilt, highpass, \

@@ -120,0 +120,0 @@ winavg_hfilt, hfilt, vertical_band_pass, denoise, migrate, \

@@ -36,3 +36,4 @@ #! /usr/bin/env python

self.long = np.flip(self.long, 0)
self.elev = np.flip(self.elev, 0)
if self.elev is not None:
self.elev = np.flip(self.elev, 0)
if self.picks is not None:

@@ -284,4 +285,5 @@ self.picks.reverse()

ind = int(lim)
if not isinstance(ind, np.ndarray) or (dimension != 'pretrig'):
if not isinstance(ind, np.ndarray) or (dimension != 'pretrig'):
if top_or_bottom == 'top':

@@ -298,2 +300,8 @@ lims = [ind, self.data.shape[0]]

self.travel_time = self.travel_time - self.travel_time[0]
if self.nmo_depth is not None:
if top_or_bottom == 'top':
self.nmo_depth = self.nmo_depth[lims[0]:lims[1]]
else:
self.nmo_depth = self.nmo_depth[lims[0]:lims[1]]
self.snum = self.data.shape[0]

@@ -623,3 +631,3 @@ mintrig = 0

if hasattr(self, 'picks') and self.picks is not None:
self.picks.crop(-top_inds)
self.picks.crop(-top_inds - 1)

@@ -626,0 +634,0 @@ self.elevation = np.hstack((np.arange(np.max(self.elev), np.min(self.elev), -dz_avg),

@@ -124,3 +124,38 @@ #! /usr/bin/env python

If osgeo cannot be imported
*Deprecated since 1.1.7. Use output_ogr (with driver='ESRI Shapefile') instead.*
"""
from warnings import warn
warn('output_shp is deprecated since 1.1.7. Use output_ogr instead', DeprecationWarning)
return self.output_ogr(fn, t_srs=t_srs, target_out=None, driver='ESRI Shapefile')
def output_ogr(self, fn, t_srs=None, target_out=None, driver='ESRI Shapefile'):
"""Output a vector file of the traces.
If there are any picks, we want to output these.
If not, we will only output the tracenumber.
This function requires osr/gdal for shapefile/gpkg/etc creation.
I suggest exporting a csv if you don't want to deal with gdal.
Parameters
----------
fn: str
The filename of the output
t_srs: int, optional
EPSG number of the target spatial reference system. Default 4326 (wgs84)
target_out: str, optional
Used to overwrite the default output format of picks.
By default, try to write depth and if there is no nmo_depth use TWTT.
You might want to use this to get the output in TWTT or sample number
(options are depth, elev, twtt, snum)
driver: str, optional
The ogr driver to use. For shapefiles, use 'ESRI Shapefile' (default).
'GPKG' is another common option.
Raises
------
ImportError
If osgeo cannot be imported
"""
if not CONVERSIONS_ENABLED:

@@ -137,3 +172,3 @@ raise ImportError('osgeo could not be imported')

driver = ogr.GetDriverByName('ESRI Shapefile')
driver = ogr.GetDriverByName(driver)
data_source = driver.CreateDataSource(fn)
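`output_shp` survives as a thin deprecated wrapper around the new `output_ogr`, a common pattern for renaming a public API without breaking existing callers. A generic sketch of that shim (names are hypothetical, not ImpDAR's actual code, and no GDAL is involved here):

```python
import warnings

def new_api(fn, driver='ESRI Shapefile'):
    # Stand-in for output_ogr: the general writer taking an OGR driver name.
    return (fn, driver)

def old_api(fn):
    # Stand-in for output_shp: warn, then delegate with the old behavior.
    warnings.warn('old_api is deprecated; use new_api instead',
                  DeprecationWarning, stacklevel=2)
    return new_api(fn, driver='ESRI Shapefile')
```

Calling `old_api('line.shp')` behaves exactly as before but surfaces a `DeprecationWarning` in test suites or under `python -W error::DeprecationWarning`.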

@@ -140,0 +175,0 @@ out_srs = osr.SpatialReference()


@@ -7,3 +7,3 @@ # ImpDAR: an impulse radar processor

ImpDAR is intended to be more flexible than other available options. Support is gradually being added for a variety of file formats. Currently, [GSSI](http://www.geophysical.com), [PulseEKKO](http://www.sensoft.ca), [Ramac](http://www.malagpr.com), [Blue Systems](http://www.bluesystem.ca/ice-penetrating-radar.html), DELORES, SEGY, [gprMAX](http://www.gprmax.com), Gecko, and legacy StoDeep files are supported. Available processing steps include various filtering operations, trivial modifications such as restacking, cropping, or reversing data, and a few different geolocation-related operations like interpolating to constant trace spacing. The integrated migration routines are in development but Stolt is working.
ImpDAR is intended to be more flexible than other available options. Support is gradually being added for a variety of file formats. Currently, [GSSI](http://www.geophysical.com), [PulseEKKO](http://www.sensoft.ca), [Ramac](http://www.malagpr.com), [Blue Systems](http://www.bluesystem.ca/ice-penetrating-radar.html), DELORES, SEGY, [gprMAX](http://www.gprmax.com), Gecko, and legacy StoDeep files are supported. ImpDAR can also read in MCoRDS files, though these are already processed so this would just be for tracing. Available processing steps include various filtering operations, trivial modifications such as restacking, cropping, or reversing data, and a few different geolocation-related operations like interpolating to constant trace spacing.

@@ -25,3 +25,3 @@ The primary interface to ImpDAR is through the command line, which allows efficient processing of large volumes of data. An API, centered around the RadarData class, is also available to allow the user to use ImpDAR in other programs.

#### Required
*Python 3* The package is tested on Python 3.7+. Older versions may work, but we have stopped testing on 2.7 since it has reached end of life. You can probably get 2.7 to work still, but no guarantees.
[Python 3](http://python.org) The package is tested on Python 3.7 to 3.10. It is probably best to upgrade to one of those versions; 3.6 is likely to work, though untested, while older versions are unlikely to work. 3.11 should be fine on the ImpDAR side, though you may have issues finding prebuilt binaries for some dependencies.

@@ -28,0 +28,0 @@ You also need:

@@ -49,3 +49,3 @@ #! /usr/bin/env python

version = '1.1.6'
version = '1.1.7'
packages = ['impdar',

@@ -73,3 +73,3 @@ 'impdar.lib',

author='David Lilien',
author_email='dal22@uw.edu',
author_email='dlilien@iu.edu',
license='GNU GPL-3.0',

@@ -90,3 +90,3 @@ entry_points={'console_scripts': console_scripts},

author='David Lilien',
author_email='dal22@uw.edu',
author_email='dlilien@iu.edu',
license='GNU GPL-3.0',

@@ -93,0 +93,0 @@ entry_points={'console_scripts': console_scripts},
