sasdata.data_util package¶
Submodules¶
sasdata.data_util.err1d module¶
Error propagation algorithms for simple arithmetic.
Warning: like the underlying numpy library, the in-place operations may return values of the wrong type if some of the arguments are integers, so be sure to create them with floating point inputs.
- sasdata.data_util.err1d.add(X, varX, Y, varY)¶
Addition with error propagation
- sasdata.data_util.err1d.add_inplace(X, varX, Y, varY)¶
In-place addition with error propagation
- sasdata.data_util.err1d.div(X, varX, Y, varY)¶
Division with error propagation
- sasdata.data_util.err1d.div_inplace(X, varX, Y, varY)¶
In-place division with error propagation
- sasdata.data_util.err1d.exp(X, varX)¶
Exponentiation with error propagation
- sasdata.data_util.err1d.log(X, varX)¶
Logarithm with error propagation
- sasdata.data_util.err1d.mul(X, varX, Y, varY)¶
Multiplication with error propagation
- sasdata.data_util.err1d.mul_inplace(X, varX, Y, varY)¶
In-place multiplication with error propagation
- sasdata.data_util.err1d.pow(X, varX, n)¶
X**n with error propagation
- sasdata.data_util.err1d.pow_inplace(X, varX, n)¶
In-place X**n with error propagation
- sasdata.data_util.err1d.sub(X, varX, Y, varY)¶
Subtraction with error propagation
- sasdata.data_util.err1d.sub_inplace(X, varX, Y, varY)¶
In-place subtraction with error propagation
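A minimal usage sketch of these functions (assuming, as the in-place variants imply, that each returns the result value together with its propagated variance):
import numpy as np
from sasdata.data_util.err1d import add, div

# Values and their variances (use floating point inputs, as advised above)
x, varx = np.array([1.0, 2.0]), np.array([0.01, 0.04])
y, vary = np.array([3.0, 4.0]), np.array([0.09, 0.16])

z, varz = add(x, varx, y, vary)   # z = x + y with varz = varx + vary
q, varq = div(x, varx, y, vary)   # quotient with propagated variance
print(z, np.sqrt(varz))           # values and their standard deviations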
sasdata.data_util.formatnum module¶
Format values and uncertainties nicely for printing.
format_uncertainty_pm()
produces the expanded format v +/- err.
format_uncertainty_compact()
produces the compact format v(##), where the number in parentheses is the uncertainty in the last two digits of v.
format_uncertainty()
uses the compact format by default, but this can be changed to the expanded +/- format by setting format_uncertainty.compact to False.
The formatted string uses only the number of digits warranted by the uncertainty in the measurement.
If the uncertainty is 0 or not otherwise provided, the simple %g floating point format option is used.
Infinite and indefinite numbers are represented as inf and NaN.
Example:
>>> v, dv = 757.2356, 0.01032
>>> print(format_uncertainty_pm(v, dv))
757.236 +/- 0.010
>>> print(format_uncertainty_compact(v, dv))
757.236(10)
>>> print(format_uncertainty(v, dv))
757.236(10)
>>> format_uncertainty.compact = False
>>> print(format_uncertainty(v, dv))
757.236 +/- 0.010
UncertaintyFormatter() returns a private formatter with its own formatter.compact flag.
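A brief sketch of a private formatter, reusing the values from the example above:
from sasdata.data_util.formatnum import UncertaintyFormatter

formatter = UncertaintyFormatter()
print(formatter(757.2356, 0.01032))   # compact form: 757.236(10)
formatter.compact = False             # switch this instance to the expanded form
print(formatter(757.2356, 0.01032))   # 757.236 +/- 0.010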
- class sasdata.data_util.formatnum.UncertaintyFormatter¶
Bases:
object
Value and uncertainty formatter.
The formatter instance will use either the expanded v +/- dv form or the compact v(##) form depending on whether formatter.compact is True or False. The default is True.
- __call__(value, uncertainty)¶
Given value and uncertainty, return a string representation.
- __module__ = 'sasdata.data_util.formatnum'¶
- __weakref__¶
list of weak references to the object
- compact = True¶
- sasdata.data_util.formatnum._format_uncertainty(value, uncertainty, compact)¶
Implementation of both the compact and the +/- formats.
- sasdata.data_util.formatnum.format_uncertainty_compact(value, uncertainty)¶
Given value v and uncertainty dv, return the compact representation v(##), where ## are the first two digits of the uncertainty.
- sasdata.data_util.formatnum.format_uncertainty_pm(value, uncertainty)¶
Given value v and uncertainty dv, return a string v +/- dv.
- sasdata.data_util.formatnum.main()¶
Run all tests.
This is equivalent to "nosetests --with-doctest".
- sasdata.data_util.formatnum.test_compact()¶
- sasdata.data_util.formatnum.test_default()¶
- sasdata.data_util.formatnum.test_pm()¶
sasdata.data_util.loader_exceptions module¶
Exceptions specific to loading data.
- exception sasdata.data_util.loader_exceptions.DataReaderException(e: str | None = None)¶
Bases:
Exception
Exception for files that were able to mostly load, but had minor issues along the way. Any exceptions of this type should be put into datainfo.errors.
- __init__(e: str | None = None)¶
- __module__ = 'sasdata.data_util.loader_exceptions'¶
- __weakref__¶
list of weak references to the object
- exception sasdata.data_util.loader_exceptions.DefaultReaderException(e: str | None = None)¶
Bases:
Exception
Exception for files with no associated reader. This should be thrown by default readers only to tell Loader to try the next reader.
- __init__(e: str | None = None)¶
- __module__ = 'sasdata.data_util.loader_exceptions'¶
- __weakref__¶
list of weak references to the object
- exception sasdata.data_util.loader_exceptions.FileContentsException(e: str | None = None)¶
Bases:
Exception
Exception for files with an associated reader, but with no loadable data. This is useful for catching loader or file format issues.
- __init__(e: str | None = None)¶
- __module__ = 'sasdata.data_util.loader_exceptions'¶
- __weakref__¶
list of weak references to the object
- exception sasdata.data_util.loader_exceptions.NoKnownLoaderException(e: str | None = None)¶
Bases:
Exception
Exception for files with no associated reader based on the file extension of the loaded file. This exception should only be thrown by loader.py.
- __init__(e: str | None = None)¶
- __module__ = 'sasdata.data_util.loader_exceptions'¶
- __weakref__¶
list of weak references to the object
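A hedged sketch of how a custom reader might use these exceptions; the reader function below is hypothetical, and only the exception classes come from this module:
from sasdata.data_util.loader_exceptions import DefaultReaderException, FileContentsException

def read_my_format(path):
    # Hypothetical reader illustrating the intended exception protocol
    if not path.endswith(".myext"):
        # A default/fallback reader signals "not my format, try the next reader"
        raise DefaultReaderException(f"{path} does not look like 'myext' data")
    with open(path) as handle:
        rows = [line.split() for line in handle if line.strip()]
    if not rows:
        # The reader matched the file but found nothing loadable in it
        raise FileContentsException(f"No loadable data found in {path}")
    return rows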
sasdata.data_util.manipulations module¶
Data manipulations for 2D data sets. Using the metadata information, various types of averaging are performed in Q-space.
To test this module use:
cd test
PYTHONPATH=../src/ python2 -m sasdataloader.test.utest_averaging DataInfoTests.test_sectorphi_quarter
- class sasdata.data_util.manipulations.Binning(min_value, max_value, n_bins, base=None)¶
Bases:
object
This class creates a binning object, either linear or logarithmic.
- __init__(min_value, max_value, n_bins, base=None)¶
- Parameters:
min_value – the value defining the start of the binning interval.
max_value – the value defining the end of the binning interval.
n_bins – the number of bins.
base – the base used for logarithmic binning; linear binning is used if None.
Beware that min_value should always be numerically smaller than max_value. Take particular care when binning angles across the 2pi to 0 discontinuity.
- __module__ = 'sasdata.data_util.manipulations'¶
- __weakref__¶
list of weak references to the object
- get_bin_index(value)¶
- Parameters:
value – the value in the binning interval whose bin index should be returned. Must be between min_value and max_value.
The general formula for logarithmic binning is: bin = floor(N * (log(x) - log(min)) / (log(max) - log(min)))
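A short sketch of linear and logarithmic binning; the expected indices follow from the formula above:
from sasdata.data_util.manipulations import Binning

linear = Binning(min_value=0.0, max_value=1.0, n_bins=10)   # linear bins
print(linear.get_bin_index(0.35))       # expected: 3

logarithmic = Binning(min_value=1e-3, max_value=10.0, n_bins=8, base=10)
print(logarithmic.get_bin_index(1e-2))  # floor(8 * 1/4) = 2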
- class sasdata.data_util.manipulations.Boxavg(x_min=0.0, x_max=0.0, y_min=0.0, y_max=0.0)¶
Bases:
Boxsum
Perform the average of counts in a 2D region of interest.
- __call__(data2D)¶
Perform the average over the region of interest
- Parameters:
data2D – Data2D object
- Returns:
average counts, error on average counts
- __init__(x_min=0.0, x_max=0.0, y_min=0.0, y_max=0.0)¶
- __module__ = 'sasdata.data_util.manipulations'¶
- class sasdata.data_util.manipulations.Boxcut(x_min=0.0, x_max=0.0, y_min=0.0, y_max=0.0)¶
Bases:
object
Find a rectangular 2D region of interest.
- __call__(data2D)¶
Find a rectangular 2D region of interest.
- Parameters:
data2D – Data2D object
- Returns:
mask, a 1D array (length = len(data)) that is True where the data points are inside the ROI and False otherwise
- __init__(x_min=0.0, x_max=0.0, y_min=0.0, y_max=0.0)¶
- __module__ = 'sasdata.data_util.manipulations'¶
- __weakref__¶
list of weak references to the object
- _find(data2D)¶
Find a rectangular 2D region of interest.
- Parameters:
data2D – Data2D object
- Returns:
out, a 1D array (length = len(data)) that is True where the data points are inside the ROI and False otherwise
- class sasdata.data_util.manipulations.Boxsum(x_min=0.0, x_max=0.0, y_min=0.0, y_max=0.0)¶
Bases:
object
Perform the sum of counts in a 2D region of interest.
- __annotations__ = {}¶
- __call__(data2D)¶
Perform the sum in the region of interest
- Parameters:
data2D – Data2D object
- Returns:
number of counts, error on number of counts, number of points summed
- __init__(x_min=0.0, x_max=0.0, y_min=0.0, y_max=0.0)¶
- __module__ = 'sasdata.data_util.manipulations'¶
- __weakref__¶
list of weak references to the object
- _sum(data2D)¶
Perform the sum in the region of interest
- Parameters:
data2D – Data2D object
- Returns:
number of counts, error on number of counts, number of entries summed
- class sasdata.data_util.manipulations.CircularAverage(r_min=0.0, r_max=0.0, bin_width=0.0005)¶
Bases:
object
Perform circular averaging on 2D data
The data returned is the distribution of counts as a function of Q
- __call__(data2D, ismask=False)¶
Perform circular averaging on the data
- Parameters:
data2D – Data2D object
- Returns:
Data1D object
- __init__(r_min=0.0, r_max=0.0, bin_width=0.0005)¶
- __module__ = 'sasdata.data_util.manipulations'¶
- __weakref__¶
list of weak references to the object
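A hedged usage sketch; the Loader import path and the data file name are assumptions not documented in this section, and the q range is arbitrary:
from sasdata.dataloader.loader import Loader              # assumed loader location
from sasdata.data_util.manipulations import CircularAverage

data2d = Loader().load("example_detector_image.h5")[0]    # hypothetical 2D data file
average = CircularAverage(r_min=0.005, r_max=0.2, bin_width=0.001)
data1d = average(data2d)                                  # Data1D: I(Q) as a function of Q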
- class sasdata.data_util.manipulations.Ring(r_min=0, r_max=0, center_x=0, center_y=0, nbins=36)¶
Bases:
object
Defines a ring on a 2D data set. The ring is defined by r_min, r_max, and the position of the center of the ring.
The data returned is the distribution of counts around the ring as a function of phi.
Phi_min and phi_max should be defined between 0 and 2*pi, measured anti-clockwise starting from the negative x-axis (the left-hand side).
- __call__(data2D)¶
Apply the ring to the data set. Returns the angular distribution for a given q range
- Parameters:
data2D – Data2D object
- Returns:
Data1D object
- __init__(r_min=0, r_max=0, center_x=0, center_y=0, nbins=36)¶
- __module__ = 'sasdata.data_util.manipulations'¶
- __weakref__¶
list of weak references to the object
- class sasdata.data_util.manipulations.Ringcut(r_min=0, r_max=0, center_x=0, center_y=0)¶
Bases:
object
Defines a ring on a 2D data set. The ring is defined by r_min, r_max, and the position of the center of the ring.
The data returned is the region inside the ring
Phi_min and phi_max should be defined between 0 and 2*pi, measured anti-clockwise starting from the negative x-axis (the left-hand side).
- __call__(data2D)¶
Apply the ring cut to the data set. Returns the region inside the ring for a given q range
- Parameters:
data2D – Data2D object
- Returns:
index array in the range
- __init__(r_min=0, r_max=0, center_x=0, center_y=0)¶
- __module__ = 'sasdata.data_util.manipulations'¶
- __weakref__¶
list of weak references to the object
- class sasdata.data_util.manipulations.SectorPhi(r_min, r_max, phi_min=0, phi_max=6.283185307179586, nbins=20, base=None)¶
Bases:
_Sector
Sector average as a function of phi. I(phi) is returned and the data is averaged over Q.
A sector is defined by r_min, r_max, phi_min, phi_max. The number of bins in phi also has to be defined.
- __call__(data2D)¶
Perform sector average and return I(phi).
- Parameters:
data2D – Data2D object
- Returns:
Data1D object
- __module__ = 'sasdata.data_util.manipulations'¶
- class sasdata.data_util.manipulations.SectorQ(r_min, r_max, phi_min=0, phi_max=6.283185307179586, nbins=20, base=None)¶
Bases:
_Sector
Sector average as a function of Q for both wings. Setting the _Sector.fold attribute determines whether the two sectors are averaged together (folded over) or kept separate. In the separate (not folded) case, the Q values for the "minor wing" are arbitrarily set to a negative value. I(Q) is returned and the data is averaged over phi.
A sector is defined by r_min, r_max, phi_min, phi_max, where r_min, r_max, phi_min, phi_max > 0. The number of bins in Q also has to be defined.
- __annotations__ = {}¶
- __call__(data2D)¶
Perform sector average and return I(Q).
- Parameters:
data2D – Data2D object
- Returns:
Data1D object
- __module__ = 'sasdata.data_util.manipulations'¶
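A sketch of the fold behaviour described above; data2d is assumed to be a Data2D object obtained as in the previous example:
from math import pi
from sasdata.data_util.manipulations import SectorQ

# A 30-degree wedge; phi is measured anti-clockwise from the negative x-axis
sector = SectorQ(r_min=0.005, r_max=0.2, phi_min=pi / 6, phi_max=pi / 3, nbins=40)
sector.fold = False          # keep both wings; minor-wing Q values become negative
# data1d = sector(data2d)    # returns a Data1D object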
- class sasdata.data_util.manipulations.Sectorcut(phi_min=0, phi_max=3.141592653589793)¶
Bases:
object
Defines a sector (major + minor) region on a 2D data set. The sector is defined by phi_min and phi_max, which set the right and left lines with respect to the central line.
Phi_min and phi_max are given in radians, and (phi_max - phi_min) should not be larger than pi.
- __call__(data2D)¶
Find the sector region of interest.
- Parameters:
data2D – Data2D object
- Returns:
mask, a 1D array (length = len(data)) that is True where the data points are inside the ROI and False otherwise
- __init__(phi_min=0, phi_max=3.141592653589793)¶
- __module__ = 'sasdata.data_util.manipulations'¶
- __weakref__¶
list of weak references to the object
- _find(data2D)¶
Find the sector region of interest.
- Parameters:
data2D – Data2D object
- Returns:
out, a 1D array (length = len(data)) that is True where the data points are inside the ROI and False otherwise
- class sasdata.data_util.manipulations.SlabX(x_min=0.0, x_max=0.0, y_min=0.0, y_max=0.0, bin_width=0.001, fold=False)¶
Bases:
_Slab
Compute average I(Qx) for a region of interest
- __call__(data2D)¶
Compute average I(Qx) for a region of interest
- Parameters:
data2D – Data2D object
- Returns:
Data1D object
- __module__ = 'sasdata.data_util.manipulations'¶
- class sasdata.data_util.manipulations.SlabY(x_min=0.0, x_max=0.0, y_min=0.0, y_max=0.0, bin_width=0.001, fold=False)¶
Bases:
_Slab
Compute average I(Qy) for a region of interest
- __annotations__ = {}¶
- __call__(data2D)¶
Compute average I(Qy) for a region of interest
- Parameters:
data2D – Data2D object
- Returns:
Data1D object
- __module__ = 'sasdata.data_util.manipulations'¶
- class sasdata.data_util.manipulations._Sector(r_min, r_max, phi_min=0, phi_max=6.283185307179586, nbins=20, base=None)¶
Bases:
object
Defines a sector region on a 2D data set. The sector is defined by r_min, r_max, phi_min and phi_max. phi_min and phi_max are defined by the right and left lines with respect to a central line, such that phi_max can be less than phi_min if they straddle the discontinuity from 2pi to 0.
Phi is defined between 0 and 2*pi, measured anti-clockwise starting from the negative x-axis.
- __annotations__ = {}¶
- __init__(r_min, r_max, phi_min=0, phi_max=6.283185307179586, nbins=20, base=None)¶
- Parameters:
base – must be a valid base for the logarithm, i.e., a positive number
- __module__ = 'sasdata.data_util.manipulations'¶
- __weakref__¶
list of weak references to the object
- _agv(data2D, run='phi')¶
Perform sector averaging.
- Parameters:
data2D – Data2D object
run – the varying parameter ('phi' or 'sector')
- Returns:
Data1D object
- class sasdata.data_util.manipulations._Slab(x_min=0.0, x_max=0.0, y_min=0.0, y_max=0.0, bin_width=0.001, fold=False)¶
Bases:
object
Compute average I(Q) for a region of interest
- __annotations__ = {}¶
- __call__(data2D)¶
Call self as a function.
- __init__(x_min=0.0, x_max=0.0, y_min=0.0, y_max=0.0, bin_width=0.001, fold=False)¶
- __module__ = 'sasdata.data_util.manipulations'¶
- __weakref__¶
list of weak references to the object
- _avg(data2D, maj)¶
Compute average I(Q_maj) for a region of interest. The major axis is defined as the axis of Q_maj. The minor axis is the axis that we average over.
- Parameters:
data2D – Data2D object
maj – the major axis (the axis of Q_maj)
- Returns:
Data1D object
- sasdata.data_util.manipulations.flip_phi(phi: float) float ¶
Force phi to be within the range 0 <= phi <= 2pi by adding or subtracting 2pi as necessary
- Returns:
phi in the range [0, 2pi]
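For instance:
from math import pi
from sasdata.data_util.manipulations import flip_phi

print(flip_phi(-pi / 2))      # -pi/2 + 2pi = 3pi/2
print(flip_phi(5 * pi / 2))   # 5pi/2 - 2pi = pi/2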
- sasdata.data_util.manipulations.get_dq_data(data2d: Data2D) array ¶
Get the dq for resolution averaging. The pinhole and detector pixel contributions are present in both directions of the 2D data and must be subtracted when converting to 1D; dq_overlap should ideally be calculated at q = 0. Note: this method works only for pinhole geometry. Extrapolate dqx(r) and dqy(phi) at q = 0 and take an average.
- sasdata.data_util.manipulations.get_intercept(q: float, q_0: float, q_1: float) float | None ¶
Returns the fraction of the side at which the q-value intercepts the pixel, or None otherwise. The value returned is the fraction ON THE SIDE OF THE LOWEST Q.
        A           B
+-----------+--------+     <--- pixel size
0                    1
Q_0 -------- Q ----- Q_1   <--- equivalent Q range
if Q_1 > Q_0, A is returned
if Q_1 < Q_0, B is returned
if Q is outside the range of [Q_0, Q_1], None is returned
- sasdata.data_util.manipulations.get_pixel_fraction(q_max: float, q_00: float, q_01: float, q_10: float, q_11: float) float ¶
Returns the fraction of the pixel defined by the four corners (q_00, q_01, q_10, q_11) that has q < q_max:
     q_01           q_11
y=1 +--------------+
    |              |
    |              |
    |              |
y=0 +--------------+
     q_00           q_10
     x=0           x=1
- sasdata.data_util.manipulations.get_pixel_fraction_square(x: float, x_min: float, x_max: float) float ¶
Return the fraction of the length from xmin to x:
      A           B
+-----------+---------+
xmin        x         xmax
- Parameters:
x – x-value
x_min – minimum x for the length considered
x_max – maximum x for the length considered
- Returns:
(x-xmin)/(xmax-xmin) when xmin < x < xmax
- sasdata.data_util.manipulations.get_q_compo(dx: float, dy: float, detector_distance: float, wavelength: float, compo: str | None = None) float ¶
This reduces the tiny error at very large q. Implementation of this function has not started yet. <-- TODO
- sasdata.data_util.manipulations.position_and_wavelength_to_q(dx: float, dy: float, detector_distance: float, wavelength: float) float ¶
- Parameters:
dx – x-distance from beam center [mm]
dy – y-distance from beam center [mm]
detector_distance – sample to detector distance [mm]
wavelength – neutron wavelength [nm]
- Returns:
q-value at the given position
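A minimal call sketch; the numbers are arbitrary and the unit of the returned q is not stated in this section:
from sasdata.data_util.manipulations import position_and_wavelength_to_q

q = position_and_wavelength_to_q(dx=15.0, dy=0.0, detector_distance=4000.0, wavelength=0.6)
print(q)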
sasdata.data_util.nxsunit module¶
Define unit conversion support for NeXus style units.
The unit format is somewhat complicated. There are variant spellings and incorrect capitalization to worry about, as well as forms such as “mili*metre” and “1e-7 seconds”.
This is a minimal implementation of units including only what I happen to need now. It does not support the complete dimensional analysis provided by the package udunits on which NeXus is based, or even the units used in the NeXus definition files.
Unlike other units packages, this package does not carry the units along with the value but merely provides a conversion function for transforming values.
Usage example:
import nxsunit
u = nxsunit.Converter('mili*metre') # Units stored in mm
v = u(3000,'m') # Convert the value 3000 mm into meters
NeXus example:
# Load sample orientation in radians regardless of how it is stored.
# 1. Open the path
file.openpath('/entry1/sample/sample_orientation')
# 2. scan the attributes, retrieving 'units'
units = [value for attr, value in file.attrs() if attr == 'units']
# 3. set up the converter (assumes that units actually exists)
u = nxsunit.Converter(units[0])
# 4. read the data and convert to the correct units
v = u(file.read(),'radians')
This is a standalone module, not relying on either DANSE or NeXus, and can be used for other unit conversion tasks.
Note: minutes are used for angle and seconds are used for time. We cannot tell what the correct interpretation is without knowing something about the fields themselves. If this becomes an issue, we will need to allow the application to set the dimension for the unit rather than inferring the dimension from an example unit.
- class sasdata.data_util.nxsunit.Converter(units: str | None = None, dimension: List[str] | None = None)¶
Bases:
object
Unit converter for NeXus style units.
The converter is initialized with the units of the source value. Various source values can then be converted to target values based on target value name.
- __call__(value: T, units: str | None = '') List[float] | T ¶
Call self as a function.
- __init__(units: str | None = None, dimension: List[str] | None = None)¶
- __module__ = 'sasdata.data_util.nxsunit'¶
- __weakref__¶
list of weak references to the object
- _get_scale_for_units(units: List[str])¶
Protected method to get scale factor and scale offset as a combined value
- _scale_with_offset(value: float, scale_base: Tuple[float, float]) float ¶
Scale the given value and add the offset using the units string supplied
- _units: List[str] = None¶
Name of the source units (km, Ang, us, …)
- dimension: List[str] = None¶
Type of the source units (distance, time, frequency, …)
- get_compatible_units() List[str] ¶
Return a list of compatible units for the current Converter object
- scale(units: str = '', value: T = None) List[float] | T ¶
Scale the given value using the units string supplied
- scalebase: float = None¶
Scale base for the source units
- scalemap: List[Dict[str, float | Tuple[float, float]]] = None¶
Scale converter, mapping unit name to scale factor or (scale, offset) for temperature units.
- scaleoffset: float = None¶
- property units: str¶
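A short conversion sketch; the specific unit strings are assumptions based on the metric-unit builders documented below:
from sasdata.data_util.nxsunit import Converter

convert = Converter('mm')               # source values stored in millimetres
print(convert(3000, 'm'))               # expected: 3.0
print(convert.get_compatible_units())   # other units this converter can target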
- sasdata.data_util.nxsunit._build_all_units()¶
Fill in the global variables DIMENSIONS and AMBIGUITIES for all available dimensions.
- sasdata.data_util.nxsunit._build_degree_units(name: str, symbol: str, conversion: float | Tuple[float, float]) Dict[str, float | Tuple[float, float]] ¶
Builds variations on the temperature unit name, including the degree symbol or the word degree.
- sasdata.data_util.nxsunit._build_inv_n_metric_units(unit: str, abbr: str, n: int = 2) Dict[str, float | Tuple[float, float]] ¶
Using the return from _build_metric_units, build inverse to the nth power variations on all units (1/x^n, invx^n, x^{-n} and x^-n)
- sasdata.data_util.nxsunit._build_inv_n_units(names: Sequence[str], conversion: float | Tuple[float, float], n: int = 2) Dict[str, float | Tuple[float, float]] ¶
Builds variations on inverse x to the nth power units, including 1/x^n, invx^n, x^-n and x^{-n}.
- sasdata.data_util.nxsunit._build_metric_units(unit: str, abbr: str) Dict[str, float] ¶
Construct standard SI names for the given unit. Builds e.g.,
s, ns, n*s, n_s
second, nanosecond, nano*second, nano_second
seconds, nanoseconds, nano*seconds, nano_seconds
Includes prefixes for femto through peta.
Ack! Allows, e.g., Coulomb and coulomb even though Coulomb is not a unit because some NeXus files store it that way!
Returns a dictionary of names and scales.
- sasdata.data_util.nxsunit._build_plural_units(**kw: Dict[str, float | Tuple[float, float]]) Dict[str, float | Tuple[float, float]] ¶
Construct names for the given units. Builds singular and plural form.
- sasdata.data_util.nxsunit._format_unit_structure(unit: str | None = None) List[str] ¶
Format units a common way
- Parameters:
unit – Unit string to be formatted
- Returns:
Formatted unit string
- sasdata.data_util.nxsunit.standardize_units(unit: str | None) List[str] ¶
Convert supplied units to a standard format for maintainability
- Parameters:
unit – Raw unit as supplied
- Returns:
Unit with known, reduced values
sasdata.data_util.registry module¶
File extension registry.
This provides routines for opening files based on extension, and registers the built-in file extensions.
- class sasdata.data_util.registry.CustomFileOpen(filename, mode='rb')¶
Bases:
object
Custom context manager to fetch file contents depending on where the file is located.
- __enter__()¶
A context method that either fetches a file from a URL or opens a local file.
- __exit__(exc_type, exc_val, exc_tb)¶
Close all open file handles when exiting the context manager.
- __init__(filename, mode='rb')¶
- __module__ = 'sasdata.data_util.registry'¶
- __weakref__¶
list of weak references to the object
- class sasdata.data_util.registry.ExtensionRegistry¶
Bases:
object
Associate a file loader with an extension.
Note that there may be multiple loaders for the same extension.
Example:
registry = ExtensionRegistry()

# Add an association by setting an element
registry['.zip'] = unzip

# Multiple extensions for one loader
registry['.tgz'] = untar
registry['.tar.gz'] = untar

# Generic extensions to use after trying more specific extensions;
# these will be checked after the more specific extensions fail.
registry['.gz'] = gunzip

# Multiple loaders for one extension
registry['.cx'] = cx1
registry['.cx'] = cx2
registry['.cx'] = cx3

# Show registered extensions
print(registry.extensions())

# Can also register a format name for explicit control from caller
registry['cx3'] = cx3
print(registry.formats())

# Retrieve loaders for a file name
registry.lookup('hello.cx') -> [cx3, cx2, cx1]

# Run loader on a filename
registry.load('hello.cx') ->
    try:
        return cx3('hello.cx')
    except:
        try:
            return cx2('hello.cx')
        except:
            return cx1('hello.cx')

# Load in a specific format ignoring extension
registry.load('hello.cx', format='cx3') ->
    return cx3('hello.cx')
- __contains__(ext: str) bool ¶
- __getitem__(ext: str) List ¶
- __init__()¶
- __module__ = 'sasdata.data_util.registry'¶
- __setitem__(ext: str, loader)¶
- __weakref__¶
list of weak references to the object
- extensions() List[str] ¶
Return a sorted list of registered extensions.
- formats() List[str] ¶
Return a sorted list of the registered formats.
- load(path: str, ext: str | None = None) List[Data1D | Data2D] ¶
Call the loader for a single file.
Exceptions are stored in Data1D instances, with the errors in Data1D.errors
- lookup(path: str) List[callable] ¶
Return the loader associated with the file type of path.
- Parameters:
path – Data file path
- Returns:
List of available readers for the file extension (may be empty)
- sasdata.data_util.registry.create_empty_data_with_errors(path: str | Path, errors: List[Exception])¶
Create a Data1D instance that only holds errors and a filepath. This allows all file paths to return a common data type, regardless of whether the data loading succeeded or failed.
sasdata.data_util.uncertainty module¶
Uncertainty propagation class for arithmetic, log and exp.
Based on scalars or numpy vectors, this class allows you to store and manipulate values+uncertainties, with propagation of Gaussian error for addition, subtraction, multiplication, division, power, exp and log.
Storage properties are determined by the numbers used to set the value and uncertainty. Be sure to use floating point uncertainty vectors for in-place operations since numpy does not do automatic type conversion. Normal operations can use mixed integer and floating point. In-place operations such as a *= b create at most one extra copy for each operation. By contrast, c = a*b uses four intermediate vectors, so it shouldn't be used for huge arrays.
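A brief sketch of the value-plus-uncertainty container; note that the constructor takes the variance while dx reports the standard deviation:
import numpy as np
from sasdata.data_util.uncertainty import Uncertainty

a = Uncertainty(np.array([1.0, 2.0]), variance=np.array([0.01, 0.04]))
b = Uncertainty(4.0, variance=0.09)

c = a * b           # Gaussian error propagation
print(c, c.dx)      # propagated values and their standard deviations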
- class sasdata.data_util.uncertainty.Uncertainty(x, variance=None)¶
Bases:
object
- __abs__()¶
- __add__(other)¶
- __and__(other)¶
- __coerce__()¶
- __complex__()¶
- __delitem__(key)¶
- __div__(other)¶
- __divmod__(other)¶
- __float__()¶
- __floordiv__(other)¶
- __getitem__(key)¶
- __hex__()¶
- __iadd__(other)¶
- __iand__(other)¶
- __idiv__(other)¶
- __idivmod__(other)¶
- __ifloordiv__(other)¶
- __ilshift__(other)¶
- __imod__(other)¶
- __imul__(other)¶
- __index__()¶
- __init__(x, variance=None)¶
- __int__()¶
- __invert__()¶
- __ior__(other)¶
- __ipow__(other)¶
- __irshift__(other)¶
- __isub__(other)¶
- __itruediv__(other)¶
- __ixor__(other)¶
- __len__()¶
- __long__()¶
- __lshift__(other)¶
- __mod__(other)¶
- __module__ = 'sasdata.data_util.uncertainty'¶
- __mul__(other)¶
- __neg__()¶
- __oct__()¶
- __or__(other)¶
Return self|value.
- __pos__()¶
- __pow__(other)¶
- __radd__(other)¶
- __rand__(other)¶
- __rdiv__(other)¶
- __rdivmod__(other)¶
- __repr__()¶
Return repr(self).
- __rfloordiv__(other)¶
- __rlshift__(other)¶
- __rmod__(other)¶
- __rmul__(other)¶
- __ror__(other)¶
Return value|self.
- __rpow__(other)¶
- __rrshift__(other)¶
- __rshift__(other)¶
- __rsub__(other)¶
- __rtruediv__(other)¶
- __rxor__(other)¶
- __setitem__(key, value)¶
- __str__()¶
Return str(self).
- __sub__(other)¶
- __truediv__(other)¶
- __weakref__¶
list of weak references to the object
- __xor__(other)¶
- _getdx()¶
- _setdx(dx)¶
- property dx¶
standard deviation
- exp()¶
- log()¶
- sasdata.data_util.uncertainty.exp(val)¶
- sasdata.data_util.uncertainty.log(val)¶
- sasdata.data_util.uncertainty.test()¶
sasdata.data_util.util module¶
- sasdata.data_util.util.unique_preserve_order(seq: List[Any]) List[Any] ¶
Remove duplicates from a list while preserving order. This is the fastest approach according to the benchmarks at https://www.peterbe.com/plog/uniqifiers-benchmark
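For example:
from sasdata.data_util.util import unique_preserve_order

print(unique_preserve_order([3, 1, 3, 2, 1]))   # [3, 1, 2]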