Combining data
For combining datasets or data arrays along a single dimension, see Concatenate below.
For combining datasets with different variables, see Merge.
For combining datasets or data arrays with different indexes or missing values, see Combine.
For combining datasets or data arrays along multiple dimensions, see Combining along multiple dimensions.
Concatenate
To combine arrays along an existing or new dimension into a larger array, you can use concat(). concat takes an iterable of DataArray or Dataset objects, as well as a dimension name, and concatenates along that dimension:
```python
In [1]: arr = xr.DataArray(np.random.randn(2, 3),
   ...:                    [('x', ['a', 'b']), ('y', [10, 20, 30])])
   ...:

In [2]: arr[:, :1]
Out[2]:
<xarray.DataArray (x: 2, y: 1)>
array([[ 0.469112],
       [-1.135632]])
Coordinates:
  * x        (x) <U1 'a' 'b'
  * y        (y) int64 10

# this resembles how you would use np.concatenate
In [3]: xr.concat([arr[:, :1], arr[:, 1:]], dim='y')
Out[3]:
<xarray.DataArray (x: 2, y: 3)>
array([[ 0.469112, -0.282863, -1.509059],
       [-1.135632,  1.212112, -0.173215]])
Coordinates:
  * x        (x) <U1 'a' 'b'
  * y        (y) int64 10 20 30
```
In addition to combining along an existing dimension, concat can create a new dimension by stacking lower-dimensional arrays together:
```python
In [4]: arr[0]
Out[4]:
<xarray.DataArray (y: 3)>
array([ 0.469112, -0.282863, -1.509059])
Coordinates:
    x        <U1 'a'
  * y        (y) int64 10 20 30

# to combine these 1d arrays into a 2d array in numpy, you would use np.array
In [5]: xr.concat([arr[0], arr[1]], 'x')
Out[5]:
<xarray.DataArray (x: 2, y: 3)>
array([[ 0.469112, -0.282863, -1.509059],
       [-1.135632,  1.212112, -0.173215]])
Coordinates:
  * y        (y) int64 10 20 30
  * x        (x) <U1 'a' 'b'
```
If the second argument to concat is a new dimension name, the arrays will be concatenated along that new dimension, which is always inserted as the first dimension:
```python
In [6]: xr.concat([arr[0], arr[1]], 'new_dim')
Out[6]:
<xarray.DataArray (new_dim: 2, y: 3)>
array([[ 0.469112, -0.282863, -1.509059],
       [-1.135632,  1.212112, -0.173215]])
Coordinates:
  * y        (y) int64 10 20 30
    x        (new_dim) <U1 'a' 'b'
Dimensions without coordinates: new_dim
```
The second argument to concat can also be an Index or DataArray object as well as a string, in which case it is used to label the values along the new dimension:
```python
In [7]: xr.concat([arr[0], arr[1]], pd.Index([-90, -100], name='new_dim'))
Out[7]:
<xarray.DataArray (new_dim: 2, y: 3)>
array([[ 0.469112, -0.282863, -1.509059],
       [-1.135632,  1.212112, -0.173215]])
Coordinates:
  * y        (y) int64 10 20 30
    x        (new_dim) <U1 'a' 'b'
  * new_dim  (new_dim) int64 -90 -100
```
Of course, concat also works on Dataset objects:
```python
In [8]: ds = arr.to_dataset(name='foo')

In [9]: xr.concat([ds.sel(x='a'), ds.sel(x='b')], 'x')
Out[9]:
<xarray.Dataset>
Dimensions:  (x: 2, y: 3)
Coordinates:
  * y        (y) int64 10 20 30
  * x        (x) <U1 'a' 'b'
Data variables:
    foo      (x, y) float64 0.4691 -0.2829 -1.509 -1.136 1.212 -0.1732
```
concat() has a number of options which provide deeper control over which variables are concatenated and how it handles conflicting variables between datasets. With the default parameters, xarray will load some coordinate variables into memory to compare them between datasets. This may be prohibitively expensive if you are manipulating your dataset lazily using Parallel computing with Dask.
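As a rough sketch of how these options fit together (the two datasets below are made up for illustration and are not the ds object defined above; exact behaviour may differ slightly between xarray versions):

```python
import xarray as xr

# Two small example datasets that share an identical, x-independent variable 'const'.
ds_a = xr.Dataset({'foo': ('x', [1, 2]), 'const': ('y', [0.5, 0.5])}, coords={'x': [0, 1]})
ds_b = xr.Dataset({'foo': ('x', [3, 4]), 'const': ('y', [0.5, 0.5])}, coords={'x': [2, 3]})

combined = xr.concat(
    [ds_a, ds_b],
    dim='x',
    data_vars='minimal',  # only concatenate data variables that already contain 'x'
    coords='minimal',     # likewise for non-index coordinates
    compat='identical',   # variables that are not concatenated must match exactly
)
# 'foo' is concatenated along x; 'const' appears once because it is identical in both inputs.
```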
Merge
To combine variables and coordinates between multiple DataArray and/or Dataset objects, use merge(). It can merge a list of Dataset, DataArray or dictionaries of objects convertible to DataArray objects:
```python
In [10]: xr.merge([ds, ds.rename({'foo': 'bar'})])
Out[10]:
<xarray.Dataset>
Dimensions:  (x: 2, y: 3)
Coordinates:
  * x        (x) <U1 'a' 'b'
  * y        (y) int64 10 20 30
Data variables:
    foo      (x, y) float64 0.4691 -0.2829 -1.509 -1.136 1.212 -0.1732
    bar      (x, y) float64 0.4691 -0.2829 -1.509 -1.136 1.212 -0.1732

In [11]: xr.merge([xr.DataArray(n, name='var%d' % n) for n in range(5)])
Out[11]:
<xarray.Dataset>
Dimensions:  ()
Data variables:
    var0     int64 0
    var1     int64 1
    var2     int64 2
    var3     int64 3
    var4     int64 4
```
If you merge another dataset (or a dictionary including data array objects), by default the resulting dataset will be aligned on the union of all index coordinates:
```python
In [12]: other = xr.Dataset({'bar': ('x', [1, 2, 3, 4]), 'x': list('abcd')})

In [13]: xr.merge([ds, other])
Out[13]:
<xarray.Dataset>
Dimensions:  (x: 4, y: 3)
Coordinates:
  * x        (x) object 'a' 'b' 'c' 'd'
  * y        (y) int64 10 20 30
Data variables:
    foo      (x, y) float64 0.4691 -0.2829 -1.509 -1.136 ... nan nan nan nan
    bar      (x) int64 1 2 3 4
```
This ensures that merge is non-destructive. xarray.MergeError is raised if you attempt to merge two variables with the same name but different values:
```python
In [14]: xr.merge([ds, ds + 1])
MergeError: conflicting values for variable 'foo' on objects to be combined:
first value: <xarray.Variable (x: 2, y: 3)>
array([[ 0.4691123 , -0.28286334, -1.5090585 ],
       [-1.13563237,  1.21211203, -0.17321465]])
second value: <xarray.Variable (x: 2, y: 3)>
array([[ 1.4691123 ,  0.71713666, -0.5090585 ],
       [-0.13563237,  2.21211203,  0.82678535]])
```
The same non-destructive merging between DataArray index coordinates is used in the Dataset constructor:
```python
In [15]: xr.Dataset({'a': arr[:-1], 'b': arr[1:]})
Out[15]:
<xarray.Dataset>
Dimensions:  (x: 2, y: 3)
Coordinates:
  * x        (x) object 'a' 'b'
  * y        (y) int64 10 20 30
Data variables:
    a        (x, y) float64 0.4691 -0.2829 -1.509 nan nan nan
    b        (x, y) float64 nan nan nan -1.136 1.212 -0.1732
```
Combine
The instance method combine_first() combines two datasets/data arrays and defaults to non-null values in the calling object, using values from the called object to fill holes. The resulting coordinates are the union of coordinate labels. Vacant cells as a result of the outer-join are filled with NaN. For example:
```python
In [16]: ar0 = xr.DataArray([[0, 0], [0, 0]], [('x', ['a', 'b']), ('y', [-1, 0])])

In [17]: ar1 = xr.DataArray([[1, 1], [1, 1]], [('x', ['b', 'c']), ('y', [0, 1])])

In [18]: ar0.combine_first(ar1)
Out[18]:
<xarray.DataArray (x: 3, y: 3)>
array([[ 0.,  0., nan],
       [ 0.,  0.,  1.],
       [nan,  1.,  1.]])
Coordinates:
  * x        (x) object 'a' 'b' 'c'
  * y        (y) int64 -1 0 1

In [19]: ar1.combine_first(ar0)
Out[19]:
<xarray.DataArray (x: 3, y: 3)>
array([[ 0.,  0., nan],
       [ 0.,  1.,  1.],
       [nan,  1.,  1.]])
Coordinates:
  * x        (x) object 'a' 'b' 'c'
  * y        (y) int64 -1 0 1
```
For datasets, ds0.combine_first(ds1) works similarly to xr.merge([ds0, ds1]), except that xr.merge raises MergeError when there are conflicting values in variables to be merged, whereas .combine_first defaults to the calling object's values.
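A minimal sketch of that difference, using two made-up datasets (not objects defined earlier on this page):

```python
import xarray as xr

# ds0 and ds1 disagree about 'a' at x=1, and each covers one extra x label.
ds0 = xr.Dataset({'a': ('x', [1.0, 2.0])}, {'x': [0, 1]})
ds1 = xr.Dataset({'a': ('x', [5.0, 6.0])}, {'x': [1, 2]})

ds0.combine_first(ds1)  # keeps ds0's value (2.0) at x=1 and fills x=2 from ds1
# xr.merge([ds0, ds1])  # would raise MergeError, because the values at x=1 conflict
```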
Update
In contrast to merge, update() modifies a dataset in-place without checking for conflicts, and will overwrite any existing variables with new values:
```python
In [20]: ds.update({'space': ('space', [10.2, 9.4, 3.9])})
Out[20]:
<xarray.Dataset>
Dimensions:  (space: 3, x: 2, y: 3)
Coordinates:
  * x        (x) <U1 'a' 'b'
  * y        (y) int64 10 20 30
  * space    (space) float64 10.2 9.4 3.9
Data variables:
    foo      (x, y) float64 0.4691 -0.2829 -1.509 -1.136 1.212 -0.1732
```
However, dimensions are still required to be consistent between different Dataset variables, so you cannot change the size of a dimension unless you replace all dataset variables that use it.
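For example, a sketch only (the variable name 'too_long' is hypothetical, and the exact error message may vary between xarray versions):

```python
# 'y' already has length 3 in ds, so adding a length-4 variable along 'y'
# without replacing everything else that uses 'y' is expected to fail.
try:
    ds.update({'too_long': ('y', [1, 2, 3, 4])})
except ValueError as err:
    print(err)  # complains about conflicting sizes for dimension 'y'
```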
update also performs automatic alignment if necessary. Unlike merge, it maintains the alignment of the original array instead of merging indexes:
```python
In [21]: ds.update(other)
Out[21]:
<xarray.Dataset>
Dimensions:  (space: 3, x: 2, y: 3)
Coordinates:
  * x        (x) object 'a' 'b'
  * y        (y) int64 10 20 30
  * space    (space) float64 10.2 9.4 3.9
Data variables:
    foo      (x, y) float64 0.4691 -0.2829 -1.509 -1.136 1.212 -0.1732
    bar      (x) int64 1 2
```
The exact same alignment logic applies when setting a variable with __setitem__ syntax:
```python
In [22]: ds['baz'] = xr.DataArray([9, 9, 9, 9, 9], coords=[('x', list('abcde'))])

In [23]: ds.baz
Out[23]:
<xarray.DataArray 'baz' (x: 2)>
array([9, 9])
Coordinates:
  * x        (x) object 'a' 'b'
```
Equals and identical
xarray objects can be compared by using the equals(), identical() and broadcast_equals() methods. These methods are used by the optional compat argument on concat and merge.
equals checks dimension names, indexes and array values:
```python
In [24]: arr.equals(arr.copy())
Out[24]: True
```
identical also checks attributes, and the name of each object:
```python
In [25]: arr.identical(arr.rename('bar'))
Out[25]: False
```
broadcast_equals does a more relaxed form of equality check that allows variables to have different dimensions, as long as values are constant along those new dimensions:
```python
In [26]: left = xr.Dataset(coords={'x': 0})

In [27]: right = xr.Dataset({'x': [0, 0, 0]})

In [28]: left.broadcast_equals(right)
Out[28]: True
```
Like pandas objects, two xarray objects are still equal or identical if they have missing values marked by NaN in the same locations.
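For instance, a small sketch with a made-up array containing a missing value:

```python
import numpy as np
import xarray as xr

with_nan = xr.DataArray([1.0, np.nan, 3.0], dims='z')
with_nan.equals(with_nan.copy())  # True: matching NaN locations still count as equal
```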
In contrast, the == operation performs element-wise comparison (like numpy):
```python
In [29]: arr == arr.copy()
Out[29]:
<xarray.DataArray (x: 2, y: 3)>
array([[ True,  True,  True],
       [ True,  True,  True]])
Coordinates:
  * x        (x) <U1 'a' 'b'
  * y        (y) int64 10 20 30
```
Note that NaN does not compare equal to NaN in element-wise comparison; you may need to deal with missing values explicitly.
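A minimal sketch of the difference, again with a made-up array:

```python
import numpy as np
import xarray as xr

a = xr.DataArray([1.0, np.nan], dims='w')
a == a                      # [True, False]: NaN != NaN element-wise
a.isnull()                  # [False, True]: locate missing values explicitly
a.fillna(0) == a.fillna(0)  # [True, True]: or compare after filling
```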
Merging with ‘no_conflicts’
The compat argument 'no_conflicts' is only available when combining xarray objects with merge. In addition to the above comparison methods it allows the merging of xarray objects with locations where either have NaN values. This can be used to combine data with overlapping coordinates as long as any non-missing values agree or are disjoint:
```python
In [30]: ds1 = xr.Dataset({'a': ('x', [10, 20, 30, np.nan])}, {'x': [1, 2, 3, 4]})

In [31]: ds2 = xr.Dataset({'a': ('x', [np.nan, 30, 40, 50])}, {'x': [2, 3, 4, 5]})

In [32]: xr.merge([ds1, ds2], compat='no_conflicts')
Out[32]:
<xarray.Dataset>
Dimensions:  (x: 5)
Coordinates:
  * x        (x) int64 1 2 3 4 5
Data variables:
    a        (x) float64 10.0 20.0 30.0 40.0 50.0
```
Note that due to the underlying representation of missing values as floating point numbers (NaN), variable data type is not always preserved when merging in this manner.
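For example, a small sketch with made-up integer datasets:

```python
import xarray as xr

# Alignment inserts NaN before merging, so the merged variable comes back as
# float64 even though both inputs store integer data.
ints1 = xr.Dataset({'a': ('x', [1, 2])}, {'x': [0, 1]})
ints2 = xr.Dataset({'a': ('x', [2, 3])}, {'x': [1, 2]})
xr.merge([ints1, ints2], compat='no_conflicts')['a'].dtype  # float64
```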
Combining along multiple dimensions
Note
There are currently three combining functions with similar names: auto_combine(), combine_by_coords(), and combine_nested(). This is because auto_combine is in the process of being deprecated in favour of the other two functions, which are more general. If your code currently relies on auto_combine, then you will be able to get similar functionality by using combine_nested.
For combining many objects along multiple dimensions xarray provides combine_nested() and combine_by_coords(). These functions use a combination of concat and merge across different variables to combine many objects into one.

combine_nested() requires specifying the order in which the objects should be combined, while combine_by_coords() attempts to infer this ordering automatically from the coordinates in the data.
combine_nested() is useful when you know the spatial relationship between each object in advance. The datasets must be provided in the form of a nested list, which specifies their relative position and ordering. A common task is collecting data from a parallelized simulation where each processor wrote out data to a separate file. A domain which was decomposed into 4 parts, 2 each along both the x and y axes, requires organising the datasets into a doubly-nested list, e.g.:
```python
In [33]: arr = xr.DataArray(name='temperature', data=np.random.randint(5, size=(2, 2)), dims=['x', 'y'])

In [34]: arr
Out[34]:
<xarray.DataArray 'temperature' (x: 2, y: 2)>
array([[3, 4],
       [3, 2]])
Dimensions without coordinates: x, y

In [35]: ds_grid = [[arr, arr], [arr, arr]]

In [36]: xr.combine_nested(ds_grid, concat_dim=['x', 'y'])
Out[36]:
<xarray.DataArray 'temperature' (x: 4, y: 4)>
array([[3, 4, 3, 4],
       [3, 2, 3, 2],
       [3, 4, 3, 4],
       [3, 2, 3, 2]])
Dimensions without coordinates: x, y
```
combine_nested() can also be used to explicitly merge datasets with different variables. For example, if we have 4 datasets, which are divided along two times, and contain two different variables, we can pass None to 'concat_dim' to specify the dimension of the nested list over which we wish to use merge instead of concat:
```python
In [37]: temp = xr.DataArray(name='temperature', data=np.random.randn(2), dims=['t'])

In [38]: precip = xr.DataArray(name='precipitation', data=np.random.randn(2), dims=['t'])

In [39]: ds_grid = [[temp, precip], [temp, precip]]

In [40]: xr.combine_nested(ds_grid, concat_dim=['t', None])
Out[40]:
<xarray.Dataset>
Dimensions:        (t: 4)
Dimensions without coordinates: t
Data variables:
    temperature    (t) float64 0.4432 -0.1102 0.4432 -0.1102
    precipitation  (t) float64 -0.1668 0.5011 -0.1668 0.5011
```
combine_by_coords() is for combining objects which have dimension coordinates which specify their relationship to and order relative to one another, for example a linearly-increasing 'time' dimension coordinate.
Here we combine two datasets using their common dimension coordinates. Notice they are concatenated in order based on the values in their dimension coordinates, not on their position in the list passed to combine_by_coords.
```python
In [41]: x1 = xr.DataArray(name='foo', data=np.random.randn(3), coords=[('x', [0, 1, 2])])

In [42]: x2 = xr.DataArray(name='foo', data=np.random.randn(3), coords=[('x', [3, 4, 5])])

In [43]: xr.combine_by_coords([x2, x1])
Out[43]:
<xarray.Dataset>
Dimensions:  (x: 6)
Coordinates:
  * x        (x) int64 0 1 2 3 4 5
Data variables:
    foo      (x) float64 -0.3553 -0.3379 0.581 0.9838 0.0578 0.7619
```
These functions can be used by open_mfdataset() to open many files as one dataset. The particular function used is specified by setting the argument 'combine' to 'by_coords' or 'nested'. This is useful for situations where your data is split across many files in multiple locations, which have some known relationship between one another.
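As a rough sketch (the file paths below are hypothetical):

```python
import xarray as xr

# Let the coordinate values in each file determine the ordering:
ds = xr.open_mfdataset('output/*.nc', combine='by_coords')

# Or spell out the ordering yourself with a (nested) list of files:
ds = xr.open_mfdataset(
    [['x0y0.nc', 'x0y1.nc'],
     ['x1y0.nc', 'x1y1.nc']],
    combine='nested',
    concat_dim=['x', 'y'],
)
```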