UnitsAwareDataArray
- class typhon.physics.units.tools.UnitsAwareDataArray(data: Any = <NA>, coords: Sequence[Sequence | pandas.Index | DataArray] | Mapping | None = None, dims: str | Iterable[Hashable] | None = None, name: Hashable | None = None, attrs: Mapping | None = None, indexes: Mapping[Any, Index] | None = None, fastpath: bool = False)
Like xarray.DataArray, but transfers units
- __init__(data: Any = <NA>, coords: Sequence[Sequence | pandas.Index | DataArray] | Mapping | None = None, dims: str | Iterable[Hashable] | None = None, name: Hashable | None = None, attrs: Mapping | None = None, indexes: Mapping[Any, Index] | None = None, fastpath: bool = False) -> None
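A minimal usage sketch (not taken from the typhon documentation): it assumes the unit is stored as a string under attrs["units"] and is transferred or combined through arithmetic, which is the behaviour this class adds on top of xarray.DataArray. Variable names, values and unit strings are illustrative.

import numpy as np

from typhon.physics.units.tools import UnitsAwareDataArray

# Construct exactly like an xarray.DataArray; the unit string is carried
# in attrs["units"] (illustrative values).
distance = UnitsAwareDataArray(
    np.array([1.0, 2.0, 4.0]), dims=("obs",), attrs={"units": "m"})
time = UnitsAwareDataArray(
    np.array([2.0, 2.0, 2.0]), dims=("obs",), attrs={"units": "s"})

# Ordinary xarray arithmetic; unlike with a plain DataArray, the units
# attribute of the result is expected to be combined via pint
# (here something like "meter / second").
speed = distance / time
print(speed.attrs.get("units"))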
Methods
__init__([data, coords, dims, name, attrs, ...])
all([dim, keep_attrs]) - Reduce this DataArray's data by applying all along some dimension(s).
any([dim, keep_attrs]) - Reduce this DataArray's data by applying any along some dimension(s).
argmax([dim, axis, keep_attrs, skipna]) - Index or indices of the maximum of the DataArray over one or more dimensions.
argmin([dim, axis, keep_attrs, skipna]) - Index or indices of the minimum of the DataArray over one or more dimensions.
argsort([axis, kind, order]) - Returns the indices that would sort this array.
as_numpy() - Coerces wrapped data and coordinates into numpy arrays, returning a DataArray.
assign_attrs(*args, **kwargs) - Assign new attrs to this object.
assign_coords([coords]) - Assign new coordinates to this object.
astype(dtype, *[, order, casting, subok, ...]) - Copy of the xarray object, with data cast to a specified type.
bfill(dim[, limit]) - Fill NaN values by propagating values backward.
broadcast_equals(other) - Two DataArrays are broadcast equal if they are equal after broadcasting them against each other such that they have the same dimensions.
broadcast_like(other, *[, exclude]) - Broadcast this DataArray against another Dataset or DataArray.
chunk([chunks, name_prefix, token, lock, ...]) - Coerce this array's data into a dask array with the given chunks.
clip([min, max, keep_attrs]) - Return an array whose values are limited to [min, max].
close() - Release any resources linked to this object.
coarsen([dim, boundary, side, coord_func]) - Coarsen object for DataArrays.
combine_first(other) - Combine two DataArray objects, with union of coordinates.
compute(**kwargs) - Manually trigger loading of this array's data from disk or a remote source into memory and return a new array.
conj() - Complex-conjugate all elements.
conjugate() - Return the complex conjugate, element-wise.
convert_calendar(calendar[, dim, align_on, ...]) - Convert the DataArray to another calendar.
copy([deep, data]) - Returns a copy of this array.
count([dim, keep_attrs]) - Reduce this DataArray's data by applying count along some dimension(s).
cumprod([dim, skipna, keep_attrs]) - Reduce this DataArray's data by applying cumprod along some dimension(s).
cumsum([dim, skipna, keep_attrs]) - Reduce this DataArray's data by applying cumsum along some dimension(s).
cumulative(dim[, min_periods]) - Accumulating object for DataArrays.
cumulative_integrate([coord, datetime_unit]) - Integrate cumulatively along the given coordinate using the trapezoidal rule.
curvefit(coords, func[, reduce_dims, ...]) - Curve fitting optimization for arbitrary functions.
diff(*args, **kwargs) - Calculate the n-th order discrete difference along given axis.
differentiate(coord[, edge_order, datetime_unit]) - Differentiate the array with the second order accurate central differences.
dot(other[, dim]) - Perform dot product of two DataArrays along their shared dims.
drop([labels, dim, errors]) - Backward compatible method based on drop_vars and drop_sel.
drop_duplicates(dim, *[, keep]) - Returns a new DataArray with duplicate dimension values removed.
drop_encoding() - Return a new DataArray without encoding on the array or any attached coords.
drop_indexes(coord_names, *[, errors]) - Drop the indexes assigned to the given coordinates.
drop_isel([indexers]) - Drop index positions from this DataArray.
drop_sel([labels, errors]) - Drop index labels from this DataArray.
drop_vars(names, *[, errors]) - Returns an array with dropped variables.
dropna(dim, *[, how, thresh]) - Returns a new array with dropped labels for missing values along the provided dimension.
equals(other) - True if two DataArrays have the same dimensions, coordinates and values; otherwise False.
expand_dims([dim, axis, ...]) - Return a new object with an additional axis (or axes) inserted at the corresponding position in the array shape.
ffill(dim[, limit]) - Fill NaN values by propagating values forward.
fillna(value) - Fill missing values in this object.
from_dict(d) - Convert a dictionary into an xarray.DataArray.
from_iris(cube) - Convert an iris.cube.Cube into an xarray.DataArray.
from_series(series[, sparse]) - Convert a pandas.Series into an xarray.DataArray.
get_axis_num(dim) - Return axis number(s) corresponding to dimension(s) in this array.
get_index(key) - Get an index for a dimension, with fall-back to a default RangeIndex.
groupby(group[, squeeze, restore_coord_dims]) - Returns a DataArrayGroupBy object for performing grouped operations.
groupby_bins(group, bins[, right, labels, ...]) - Returns a DataArrayGroupBy object for performing grouped operations.
head([indexers]) - Return a new DataArray whose data is given by the first n values along the specified dimension(s).
identical(other) - Like equals, but also checks the array name and attributes, and attributes on all coordinates.
idxmax([dim, skipna, fill_value, keep_attrs]) - Return the coordinate label of the maximum value along a dimension.
idxmin([dim, skipna, fill_value, keep_attrs]) - Return the coordinate label of the minimum value along a dimension.
integrate([coord, datetime_unit]) - Integrate along the given coordinate using the trapezoidal rule.
interp([coords, method, assume_sorted, kwargs]) - Interpolate a DataArray onto new coordinates.
interp_calendar(target[, dim]) - Interpolates the DataArray to another calendar based on decimal year measure.
interp_like(other[, method, assume_sorted, ...]) - Interpolate this object onto the coordinates of another object, filling out of range values with NaN.
interpolate_na([dim, method, limit, ...]) - Fill in NaNs by interpolating according to different methods.
isel([indexers, drop, missing_dims]) - Return a new DataArray whose data is given by selecting indexes along the specified dimension(s).
isin(test_elements) - Tests each value in the array for whether it is in test elements.
isnull([keep_attrs]) - Test each value in the array for whether it is a missing value.
item(*args) - Copy an element of an array to a standard Python scalar and return it.
load(**kwargs) - Manually trigger loading of this array's data from disk or a remote source into memory and return this array.
map_blocks(func[, args, kwargs, template]) - Apply a function to each block of this DataArray.
max([dim, skipna, keep_attrs]) - Reduce this DataArray's data by applying max along some dimension(s).
mean(*args, **kwargs) - Reduce this DataArray's data by applying mean along some dimension(s).
median(*args, **kwargs) - Reduce this DataArray's data by applying median along some dimension(s).
min([dim, skipna, keep_attrs]) - Reduce this DataArray's data by applying min along some dimension(s).
notnull([keep_attrs]) - Test each value in the array for whether it is not a missing value.
pad([pad_width, mode, stat_length, ...]) - Pad this array along one or more dimensions.
persist(**kwargs) - Trigger computation in constituent dask arrays.
pipe(func, *args, **kwargs) - Apply func(self, *args, **kwargs).
polyfit(dim, deg[, skipna, rcond, w, full, cov]) - Least squares polynomial fit.
prod([dim, skipna, min_count, keep_attrs]) - Reduce this DataArray's data by applying prod along some dimension(s).
quantile(q[, dim, method, keep_attrs, ...]) - Compute the qth quantile of the data along the specified dimension.
query([queries, parser, engine, missing_dims]) - Return a new data array indexed along the specified dimension(s), where the indexers are given as strings containing Python expressions to be evaluated against the values in the array.
rank(dim, *[, pct, keep_attrs]) - Ranks the data.
reduce(func[, dim, axis, keep_attrs, keepdims]) - Reduce this array by applying func along some dimension(s).
reindex([indexers, method, tolerance, copy, ...]) - Conform this object onto the indexes of another object, filling in missing values with fill_value.
reindex_like(other, *[, method, tolerance, ...]) - Conform this object onto the indexes of another object, for indexes which the objects share.
rename([new_name_or_name_dict]) - Returns a new DataArray with renamed coordinates, dimensions or a new name.
reorder_levels([dim_order]) - Rearrange index levels using input order.
resample([indexer, skipna, closed, label, ...]) - Returns a Resample object for performing resampling operations.
reset_coords([names, drop]) - Given names of coordinates, reset them to become variables.
reset_index(dims_or_levels[, drop]) - Reset the specified index(es) or multi-index level(s).
roll([shifts, roll_coords]) - Roll this array by an offset along one or more dimensions.
rolling([dim, min_periods, center]) - Rolling window object for DataArrays.
rolling_exp([window, window_type]) - Exponentially-weighted moving window.
round(*args, **kwargs) - Round an array to the given number of decimals.
searchsorted(v[, side, sorter]) - Find indices where elements of v should be inserted into this array to maintain order.
sel([indexers, method, tolerance, drop]) - Return a new DataArray whose data is given by selecting index labels along the specified dimension(s).
set_close(close) - Register the function that releases any resources linked to this object.
set_index([indexes, append]) - Set DataArray (multi-)indexes using one or more existing coordinates.
set_xindex(coord_names[, index_cls]) - Set a new, Xarray-compatible index from one or more existing coordinate(s).
shift([shifts, fill_value]) - Shift this DataArray by an offset along one or more dimensions.
sortby(variables[, ascending]) - Sort object by labels or values (along an axis).
squeeze([dim, drop, axis]) - Return a new object with squeezed data.
stack([dim, create_index, index_cls]) - Stack any number of existing dimensions into a single new dimension.
std([dim, skipna, ddof, keep_attrs]) - Reduce this DataArray's data by applying std along some dimension(s).
sum(*args, **kwargs) - Reduce this DataArray's data by applying sum along some dimension(s).
swap_dims([dims_dict]) - Returns a new DataArray with swapped dimensions.
tail([indexers]) - Return a new DataArray whose data is given by the last n values along the specified dimension(s).
thin([indexers]) - Return a new DataArray whose data is given by each n value along the specified dimension(s).
to(new_unit, *contexts, **kwargs) - Convert to another unit (see the conversion sketch after this table).
to_dask_dataframe([dim_order, set_index]) - Convert this array into a dask.dataframe.DataFrame.
to_dataframe([name, dim_order]) - Convert this array and its coordinates into a tidy pandas.DataFrame.
to_dataset([dim, name, promote_attrs]) - Convert a DataArray to a Dataset.
to_dict([data, encoding]) - Convert this xarray.DataArray into a dictionary following xarray naming conventions.
to_index() - Convert this variable to a pandas.Index.
to_iris() - Convert this array into an iris.cube.Cube.
to_masked_array([copy]) - Convert this array into a numpy.ma.MaskedArray.
to_netcdf([path, mode, format, group, ...]) - Write DataArray contents to a netCDF file.
to_numpy() - Coerces wrapped data to numpy and returns a numpy.ndarray.
to_pandas() - Convert this array into a pandas object with the same shape.
to_root_units() - Equivalent of pint's Quantity.to_root_units.
to_series() - Convert this array into a pandas.Series.
to_unstacked_dataset(dim[, level]) - Unstack DataArray expanding to Dataset along a given level of a stacked coordinate.
to_zarr([store, chunk_store, mode, ...]) - Write DataArray contents to a Zarr store.
transpose(*dim[, transpose_coords, missing_dims]) - Return a new DataArray object with transposed dimensions.
unify_chunks() - Unify chunk size along all chunked dimensions of this DataArray.
unstack([dim, fill_value, sparse]) - Unstack existing dimensions corresponding to MultiIndexes into multiple new dimensions.
var([dim, skipna, ddof, keep_attrs]) - Reduce this DataArray's data by applying var along some dimension(s).
weighted(weights) - Weighted DataArray operations.
where(cond[, other, drop]) - Filter elements from this object according to a condition.
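As referenced in the to() row above, a hedged sketch of unit conversion. The accepted unit strings depend on the pint unit registry typhon configures, and the zero-argument call to to_root_units() is an assumption based on its description mirroring pint's Quantity.to_root_units.

import numpy as np

from typhon.physics.units.tools import UnitsAwareDataArray

temperature = UnitsAwareDataArray(
    np.array([273.15, 300.0]), dims=("time",), attrs={"units": "K"})

# to() converts the values and updates attrs["units"] in one step.
celsius = temperature.to("degC")

# to_root_units() reduces to pint's root units (kelvin is already a
# root unit here, so values are unchanged); assumed zero-argument call.
base = temperature.to_root_units()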
Attributes
T
attrs - Dictionary storing arbitrary metadata with this array.
chunks - Tuple of block lengths for this dataarray's data, in order of dimensions, or None if the underlying data is not a dask array.
chunksizes - Mapping from dimension names to block lengths for this dataarray's data, or None if the underlying data is not a dask array.
coords - Mapping of DataArray objects corresponding to coordinate variables.
data - The DataArray's data as an array.
dims - Tuple of dimension names associated with this array.
dt - alias of CombinedDatetimelikeAccessor[DataArray]
dtype - Data-type of the array's elements.
encoding - Dictionary of format-specific settings for how this array should be serialized.
imag - The imaginary part of the array.
indexes - Mapping of pandas.Index objects used for label based indexing.
loc - Attribute for location based indexing like pandas.
name - The name of this array.
nbytes - Total bytes consumed by the elements of this DataArray's data.
ndim - Number of array dimensions.
real - The real part of the array.
shape - Tuple of array dimensions.
size - Number of elements in the array.
sizes - Ordered mapping from dimension names to lengths.
str - alias of StringAccessor[DataArray]
values - The array's data converted to numpy.ndarray.
variable - Low level interface to the Variable object for this DataArray.
xindexes - Mapping of Index objects used for label based indexing.