Lesson 6. Calculate Seasonal Summary Values from Climate Data Variables Stored in NetCDF 4 Format: Work With MACA v2 Climate Data in Python
Chapter 13 - NETCDF 4 Climate Data in Open Source Python
In this chapter, you will learn how to work with climate datasets (MACA v2 for the United States) stored in NetCDF 4 format using open source Python.
Learning Objectives
After completing this chapter, you will be able to:
- Summarize MACA v2 climate data stored in NetCDF 4 format by season across all time periods using xarray.
- Summarize MACA v2 climate data stored in NetCDF 4 format by season and across years using xarray.
What You Need
You will need a computer with internet access to complete this lesson and …
Calculate Seasonal Averages Using MACA v2 Climate Data
In this lesson, you will learn how to calculate seasonal averages over several years using MACA v2 climate data downloaded in NetCDF 4 format using xarray.
In this example, you will use the forecast temperature data downloaded from the northwestknowledge.net website.
import os
import numpy as np
import pandas as pd
import matplotlib.pyplot as plt
import cartopy.crs as ccrs
import cartopy.feature as cfeature
import seaborn as sns
import geopandas as gpd
import earthpy as et
import xarray as xr
import regionmask
# Plotting options
sns.set(font_scale=1.3)
sns.set_style("white")
# Optional - set your working directory if you wish to use the data
# accessed lower down in this notebook (the USA state boundary data)
os.chdir(os.path.join(et.io.HOME,
'earth-analytics',
'data'))
To begin, you can download and open up a MACA v2 netcdf file. The file below is a projected maximum temperature dataset downscaled using the BNU-ESM
model for 2006-2099.
# Get netcdf file
data_path_monthly = 'http://thredds.northwestknowledge.net:8080/thredds/dodsC/agg_macav2metdata_tasmax_BNU-ESM_r1i1p1_rcp45_2006_2099_CONUS_monthly.nc'
# Open up the data
# Open the data - xarray loads the remote THREDDS data lazily,
# so values are only fetched when you access them
monthly_forecast_temp_xr = xr.open_dataset(data_path_monthly)
# xarray object
monthly_forecast_temp_xr
<xarray.Dataset>
Dimensions:          (lat: 585, crs: 1, lon: 1386, time: 1128)
Coordinates:
  * lat              (lat) float64 25.06 25.1 25.15 25.19 ... 49.31 49.35 49.4
  * crs              (crs) int32 1
  * lon              (lon) float64 235.2 235.3 235.3 235.4 ... 292.9 292.9 292.9
  * time             (time) object 2006-01-15 00:00:00 ... 2099-12-15 00:00:00
Data variables:
    air_temperature  (time, lat, lon) float32 ...
Attributes: (12/46)
    description:           Multivariate Adaptive Constructed Analog...
    id:                    MACAv2-METDATA
    naming_authority:      edu.uidaho.reacch
    Metadata_Conventions:  Unidata Dataset Discovery v1.0
    Metadata_Link:
    cdm_data_type:         FLOAT
    ...                    ...
    contributor_role:      Postdoctoral Fellow
    publisher_name:        REACCH
    publisher_email:       reacch@uidaho.edu
    publisher_url:         http://www.reacchpna.org/
    license:               Creative Commons CC0 1.0 Universal Dedic...
    coordinate_system:     WGS84,EPSG:4326
In the example below, you subset data for the state of California, similar to what you did in the previous lesson. You can select any state that you wish for this analysis!
# Download natural earth data to generate AOI
url = (
"https://naturalearth.s3.amazonaws.com/"
"50m_cultural/ne_50m_admin_1_states_provinces_lakes.zip"
)
states_gdf = gpd.read_file(url)
cali_aoi = states_gdf[states_gdf.name == "California"]
# Helper Function to extract AOI
def get_aoi(shp, world=True):
    """Take a geopandas object and convert it to a lat/lon extent."""
    lon_lat = {}
    # Get lat min, max
    aoi_lat = [float(shp.total_bounds[1]), float(shp.total_bounds[3])]
    aoi_lon = [float(shp.total_bounds[0]), float(shp.total_bounds[2])]
    # Handle the 0-360 lon values used by the MACA v2 grid
    if world:
        aoi_lon[0] = aoi_lon[0] + 360
        aoi_lon[1] = aoi_lon[1] + 360
    lon_lat["lon"] = aoi_lon
    lon_lat["lat"] = aoi_lat
    return lon_lat
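The longitude handling above matters because the MACA v2 grid stores longitude as 0-360 degrees east, while Natural Earth shapefiles use -180 to 180. A minimal sketch of the conversion that `get_aoi()` applies when `world=True` (the helper name `to_0_360` is illustrative, not part of the lesson's code):

```python
# Convert a longitude from the -180 to 180 convention to 0-360
# degrees east, matching the MACA v2 grid coordinates
def to_0_360(lon):
    """Shift negative longitudes into the 0-360 range."""
    return lon + 360 if lon < 0 else lon

# California's western edge in each convention
print(to_0_360(-124.7722))  # 235.2278
```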
# Get lat min, max from Cali aoi extent
cali_bounds = get_aoi(cali_aoi)
# Slice by time & aoi location
start_date = "2059-12-15"
end_date = "2099-12-15"
cali_temp = monthly_forecast_temp_xr["air_temperature"].sel(
time=slice(start_date, end_date),
lon=slice(cali_bounds["lon"][0], cali_bounds["lon"][1]),
lat=slice(cali_bounds["lat"][0], cali_bounds["lat"][1]))
cali_temp
<xarray.DataArray 'air_temperature' (time: 481, lat: 227, lon: 246)>
[26860002 values with dtype=float32]
Coordinates:
  * lat      (lat) float64 32.56 32.6 32.65 32.69 ... 41.85 41.9 41.94 41.98
  * lon      (lon) float64 235.6 235.7 235.7 235.8 ... 245.7 245.8 245.8 245.9
  * time     (time) object 2059-12-15 00:00:00 ... 2099-12-15 00:00:00
Attributes:
    long_name:      Monthly Average of Daily Maximum Near-Surface Air Tempera...
    units:          K
    grid_mapping:   crs
    standard_name:  air_temperature
    height:         2 m
    cell_methods:   time: maximum(interval: 24 hours);mean over days
    _ChunkSizes:    [ 10  44 107]
print("Time Period start: ", cali_temp.time.min().values)
print("Time Period end: ", cali_temp.time.max().values)
Time Period start: 2059-12-15 00:00:00
Time Period end: 2099-12-15 00:00:00
# Create the region mask object - this is used to identify
# the grid cells that fall within the California boundary
cali_mask = regionmask.mask_3D_geopandas(cali_aoi,
                                         monthly_forecast_temp_xr.lon,
                                         monthly_forecast_temp_xr.lat)
# Mask the netcdf data
cali_temp_masked = cali_temp.where(cali_mask)
cali_temp_masked.dims
('time', 'lat', 'lon', 'region')
cali_temp.values.shape
(481, 227, 246)
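The masking step keeps cells inside California and turns everything else into NaN. A minimal numpy sketch of the same idea (toy values, not the real region mask):

```python
import numpy as np

# A tiny 2 x 3 "temperature" grid and a boolean mask marking
# which cells fall inside a hypothetical region of interest
temp = np.array([[280.0, 281.0, 282.0],
                 [283.0, 284.0, 285.0]])
inside = np.array([[True, False, True],
                   [False, True, False]])

# Mimic xarray's .where(): keep in-region values, NaN elsewhere
masked = np.where(inside, temp, np.nan)

# Summary statistics then ignore the out-of-region cells
print(np.nanmean(masked))  # mean of 280, 282, 284 -> 282.0
```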
Calculate the mean temperature for each season across the entire dataset. This will produce four arrays, one representing the mean temperature for each season.
cali_season_summary = cali_temp_masked.groupby(
'time.season').mean('time', skipna=True)
# This will create 4 arrays - one for each season showing mean temperature values
cali_season_summary.shape
(4, 227, 246, 1)
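The `'time.season'` grouping used above can be illustrated on a small pandas series. The values below are hypothetical, not MACA data; the season labels mirror the DJF/MAM/JJA/SON groups that xarray produces:

```python
import pandas as pd

# Twelve hypothetical monthly temperatures (K) for a single year
times = pd.date_range("2059-01-01", periods=12, freq="MS")
temps = pd.Series(range(280, 292), index=times, dtype=float)

# Map each month to its meteorological season, mirroring the
# labels produced by xarray's groupby('time.season')
season_of = {12: "DJF", 1: "DJF", 2: "DJF", 3: "MAM", 4: "MAM", 5: "MAM",
             6: "JJA", 7: "JJA", 8: "JJA", 9: "SON", 10: "SON", 11: "SON"}
seasonal_mean = temps.groupby(temps.index.month.map(season_of)).mean()
print(seasonal_mean)
```

Just as with the MACA data, all years (here, all months) that share a season collapse into a single mean per season.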
Plot the seasonal data.
# Create a plot showing mean temperature across seasons
cali_season_summary.plot(col='season', col_wrap=2, figsize=(10, 10))
plt.suptitle("Mean Temperature Across All Selected Years By Season \n California, USA",
y=1.05)
plt.show()
Calculate Unweighted Seasonal Averages By Season Across Each Year
Above you created a single value per season that summarized seasonal data across all years. However, you may want to look at year-to-year seasonal variation in the projected data. You can calculate seasonal statistics by
- resampling the data and then
- grouping the data and summarizing it
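The two steps above can be sketched first on a toy pandas series (hypothetical values, not MACA data). `'QS-DEC'` buckets months into quarters that start in December, so each quarter lines up with a meteorological season:

```python
import pandas as pd

# Six hypothetical monthly temperatures (K) spanning Dec 2059 - May 2060
times = pd.date_range("2059-12-01", periods=6, freq="MS")
temps = pd.Series([286.0, 287.0, 288.0, 295.0, 296.0, 297.0], index=times)

# Resample into December-anchored quarters: Dec-Feb (winter)
# and Mar-May (spring), one mean value per season per year
seasonal = temps.resample("QS-DEC").mean()
print(seasonal)  # 287.0 for Dec-Feb, 296.0 for Mar-May
```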
# Resample the data by season across all years
cali_season_mean_all_years = cali_temp_masked.resample(
    time='QS-DEC').mean(keep_attrs=True)
cali_season_mean_all_years.shape
(161, 227, 246, 1)
# Summarize each array into one single (mean) value
cali_seasonal_mean = cali_season_mean_all_years.groupby('time').mean([
"lat", "lon"])
cali_seasonal_mean.shape
(161, 1)
# Plot the data
f, ax = plt.subplots(figsize=(10, 4))
cali_seasonal_mean.plot(marker="o",
color="grey",
markerfacecolor="purple",
markeredgecolor="purple")
ax.set(title="Seasonal Mean Temperature")
plt.show()
Export Seasonal Climate Projection Data To a .csv File
At this point, you can convert the data to a dataframe and export it to a .csv file if you wish.
# Convert to a dataframe
cali_seasonal_mean_df = cali_seasonal_mean.to_dataframe()
cali_seasonal_mean_df
| time | region | air_temperature |
|---|---|---|
| 2059-12-01 00:00:00 | 53 | 287.588531 |
| 2060-03-01 00:00:00 | 53 | 294.914917 |
| 2060-06-01 00:00:00 | 53 | 307.549652 |
| 2060-09-01 00:00:00 | 53 | 296.199188 |
| 2060-12-01 00:00:00 | 53 | 286.922699 |
| ... | ... | ... |
| 2098-12-01 00:00:00 | 53 | 289.853882 |
| 2099-03-01 00:00:00 | 53 | 296.710419 |
| 2099-06-01 00:00:00 | 53 | 308.563416 |
| 2099-09-01 00:00:00 | 53 | 300.005157 |
| 2099-12-01 00:00:00 | 53 | 289.929535 |

161 rows × 1 columns
# Export a csv file
cali_seasonal_mean_df.to_csv("cali-seasonal-temp.csv")
Plot Seasonal Data By Season
Using groupby(), you can group the data and plot it by season to better examine seasonal trends.
colors = {3: "grey", 6: "lightgreen", 9: "green", 12: "purple"}
seasons = {3: "spring", 6: "summer", 9: "fall", 12: "winter"}
f, ax = plt.subplots(figsize=(10, 7))
for month, arr in cali_seasonal_mean.groupby('time.month'):
arr.plot(ax=ax,
color="grey",
marker="o",
markerfacecolor=colors[month],
markeredgecolor=colors[month],
label=seasons[month])
ax.legend(bbox_to_anchor=(1.05, 1), loc='upper left')
ax.set(title="Seasonal Change in Mean Temperature Over Time")
plt.show()
Weighted Summary by Season
To begin, you will generate a list of the number of days in each month, which will be used to weight your seasonal summary data by the days in each month.
# Calculate seasonal averages
# http://xarray.pydata.org/en/stable/examples/monthly-means.html
month_length = cali_temp_masked.time.dt.days_in_month
month_length
<xarray.DataArray 'days_in_month' (time: 481)>
array([31, 31, 28, 31, 30, 31, 30, 31, 31, 30, 31, 30, 31, 31, 28, 31, 30,
       31, 30, 31, 31, 30, 31, 30, ..., 31, 31, 28, 31, 30, 31, 30, 31, 31,
       30, 31, 30, 31])
Coordinates:
  * time     (time) object 2059-12-15 00:00:00 ... 2099-12-15 00:00:00
Next, divide the data grouped by season by the total number of days represented in each season to create weighted values.
# Calculate a weighted mean by season
cali_weighted_mean = ((cali_temp * month_length).resample(time='QS-DEC').sum() /
                      month_length.resample(time='QS-DEC').sum())
# The sums return 0 rather than nan for all-nan cells,
# so replace 0 values with nan
cali_weighted_mean = cali_weighted_mean.where(cali_weighted_mean)
cali_weighted_mean.shape
(161, 227, 246)
cali_weighted_season_value = cali_weighted_mean.groupby('time').mean([
"lat", "lon"])
cali_weighted_season_value.shape
(161,)
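The day-count weighting can be sanity-checked by hand for a single winter season. The grid-cell values below are hypothetical, not MACA output; note that a noleap calendar means February always has 28 days:

```python
# Hypothetical monthly means (K) for Dec, Jan, Feb and their day counts
temps = [286.0, 285.0, 288.0]
days = [31, 31, 28]

# Weighted seasonal mean: each month contributes in proportion
# to its number of days
weighted = sum(t * d for t, d in zip(temps, days)) / sum(days)

# Unweighted mean treats all months equally
unweighted = sum(temps) / len(temps)
print(weighted, unweighted)
```

Because the shorter month (February) happens to be the warmest here, the weighted mean comes out slightly lower than the unweighted one, which is exactly the kind of difference plotted below.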
colors = {3: "grey", 6: "lightgreen", 9: "green", 12: "purple"}
seasons = {3: "Spring", 6: "Summer", 9: "Fall", 12: "Winter"}
f, (ax1, ax2) = plt.subplots(2, 1, figsize=(10, 7), sharey=True)
for month, arr in cali_weighted_season_value.groupby('time.month'):
arr.plot(ax=ax1,
color="grey",
marker="o",
markerfacecolor=colors[month],
markeredgecolor=colors[month],
label=seasons[month])
ax1.legend(bbox_to_anchor=(1.05, 1), loc='upper left')
ax1.set(title="Weighted Seasonal Change in Mean Temperature Over Time")
for month, arr in cali_seasonal_mean.groupby('time.month'):
arr.plot(ax=ax2,
color="grey",
marker="o",
markerfacecolor=colors[month],
markeredgecolor=colors[month],
label=seasons[month])
ax2.set(title="Unweighted Seasonal Change in Mean Temperature Over Time")
f.tight_layout()
plt.show()
If you want, you can compare the difference between weighted vs unweighted values.
# What does the difference look like weighted vs unweighted?
cali_seasonal_mean - cali_weighted_season_value
<xarray.DataArray (time: 161, region: 1)>
array([[2.42445398],
       [1.1149551 ],
       [0.56349408],
       [1.41986191],
       [2.10024405],
       ...
       [0.85855153],
       [0.46202606],
       [1.72129072],
       [1.99271339]])
Coordinates:
  * time     (time) object 2059-12-01 00:00:00 ... 2099-12-01 00:00:00
  * region   (region) int64 53
The Same Analysis for the West Coast
Above, you calculated seasonal summaries for the state of California. Following the workflow that you learned in the previous lesson, you can implement the same analysis for any area of interest (AOI), including each region in a shapefile.
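The code below relies on a `get_aoi` helper defined in the previous lesson. A minimal sketch of what such a helper might look like is shown here; the details (the `is_global` parameter and the +360 longitude shift, which matches the 0–360 longitude convention MACA v2 uses) are assumptions, not the lesson's actual implementation.

```python
# Hypothetical sketch of the get_aoi helper used below -- the real function
# is defined in the previous lesson; details here are assumptions.
def get_aoi(gdf, is_global=False):
    """Return the bounds of a GeoDataFrame as a dict of lon/lat ranges.

    MACA v2 longitudes run 0-360, so western-hemisphere longitudes
    (negative degrees east) are shifted by +360 unless is_global is set.
    """
    lon_min, lat_min, lon_max, lat_max = gdf.total_bounds
    if not is_global:
        # Convert -180..180 longitudes to the 0..360 convention
        lon_min += 360
        lon_max += 360
    return {"lon": [lon_min, lon_max], "lat": [lat_min, lat_max]}
```

The returned dict can then be unpacked into `lon` / `lat` slices when selecting from the xarray dataset, as done below.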
# Create AOI Subset
cali_or_wash_nev = states_gdf[states_gdf.name.isin(
["California", "Oregon", "Washington", "Nevada"])]
west_bounds = get_aoi(cali_or_wash_nev)
# Create the mask
west_mask = regionmask.mask_3D_geopandas(cali_or_wash_nev,
monthly_forecast_temp_xr.lon,
monthly_forecast_temp_xr.lat)
# Slice by time & aoi location
start_date = "2059-12-15"
end_date = "2099-12-15"
west_temp = monthly_forecast_temp_xr["air_temperature"].sel(
time=slice(start_date, end_date),
lon=slice(west_bounds["lon"][0], west_bounds["lon"][1]),
lat=slice(west_bounds["lat"][0], west_bounds["lat"][1]))
# Apply the mask
west_temp_masked = west_temp.where(west_mask)
west_temp_masked
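Calling `.where(west_mask)` keeps pixel values inside each region and sets everything else to `NaN`; because the mask is 3-D (lat, lon, region), the result gains a new `region` dimension. A toy NumPy sketch of that broadcasting behavior (all array shapes and values here are made up for illustration):

```python
import numpy as np

# Toy grid: 2 time steps x 3 lat x 3 lon of "temperature" values
temps = np.arange(18, dtype=float).reshape(2, 3, 3)

# Toy 3-D mask: (lat, lon, region) booleans, one layer per region
mask = np.zeros((3, 3, 2), dtype=bool)
mask[:2, :2, 0] = True   # region 0 covers the upper-left 2x2 block
mask[1:, 1:, 1] = True   # region 1 covers the lower-right 2x2 block

# Broadcasting temps (time, lat, lon, 1) against mask (lat, lon, region)
# mimics xarray's .where(): NaN outside each region, plus a new region axis
masked = np.where(mask, temps[..., np.newaxis], np.nan)
print(masked.shape)  # (2, 3, 3, 2)
```

This is why the masked DataArray below has shape `(time, lat, lon, region)` and is mostly `NaN`: each region layer only holds values inside that region's polygon.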
<xarray.DataArray 'air_temperature' (time: 481, lat: 395, lon: 256, region: 4)>
array([[[[nan, nan, nan, nan],
         ...,
         [nan, nan, nan, nan]]]], dtype=float32)
Coordinates:
  * lat      (lat) float64 32.56 32.6 32.65 32.69 ... 48.85 48.9 48.94 48.98
  * lon      (lon) float64 235.3 235.4 235.4 235.4 ... 245.8 245.9 245.9 245.9
  * time     (time) object 2059-12-15 00:00:00 ... 2099-12-15 00:00:00
  * region   (region) int64 53 82 86 96
Attributes:
    long_name:      Monthly Average of Daily Maximum Near-Surface Air Temperature
    units:          K
    grid_mapping:   crs
    standard_name:  air_temperature
    height:         2 m
    cell_methods:   time: maximum(interval: 24 hours);mean over days
    _ChunkSizes:    [ 10  44 107]
# This produces a raster for each season over time across regions.
# Note: keep_attrs is passed to .mean(), not .resample() -- passing it
# to .resample() is deprecated and raises a UserWarning in newer xarray.
west_coast_mean_temp_raster = west_temp_masked.resample(
    time='QS-DEC').mean(keep_attrs=True)
west_coast_mean_temp_raster.shape
(161, 395, 256, 4)
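The `'QS-DEC'` frequency used above anchors quarter starts at December, so each bin is a climatological season: DJF (winter), MAM (spring), JJA (summer), SON (fall). A small pandas illustration of how monthly values fall into those bins:

```python
import pandas as pd

# Eight monthly timestamps, like a slice of the MACA monthly time coordinate
months = pd.date_range("2059-12-01", periods=8, freq="MS")
values = pd.Series(range(8), index=months)

# 'QS-DEC' anchors quarter starts at Dec/Mar/Jun/Sep: DJF, MAM, JJA, SON
seasonal_means = values.resample("QS-DEC").mean()
print(seasonal_means)
# Dec+Jan+Feb average to the 2059-12-01 bin, Mar+Apr+May to 2060-03-01, ...
```

This explains the time dimension of 161 in the shape above: the 481 monthly steps collapse into quarterly (seasonal) bins.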
# This produces a regional summary
regional_summary = west_coast_mean_temp_raster.groupby('time').mean([
"lat", "lon"])
regional_summary.plot(col="region",
marker="o",
color="grey",
markerfacecolor="purple",
markeredgecolor="purple",
col_wrap=2,
figsize=(12, 8))
plt.suptitle("Seasonal Temperature by Region", y=1.03)
plt.show()
# The data can then be easily converted to a dataframe
regional_summary.to_dataframe()
| time | region | air_temperature |
|---|---|---|
| 2059-12-01 00:00:00 | 53 | 287.588531 |
| | 82 | 281.238739 |
| | 86 | 280.193787 |
| | 96 | 279.038940 |
| 2060-03-01 00:00:00 | 53 | 294.914886 |
| ... | ... | ... |
| 2099-09-01 00:00:00 | 96 | 290.840363 |
| 2099-12-01 00:00:00 | 53 | 289.929504 |
| | 82 | 284.486908 |
| | 86 | 282.051971 |
| | 96 | 280.577484 |

644 rows × 1 columns
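The dataframe above is indexed by a `(time, region)` MultiIndex. If you want one column per region instead (for example, to export a CSV or plot regions side by side), you can unstack the `region` level. A small sketch with made-up temperature values:

```python
import pandas as pd

# Mimic the (time, region) MultiIndex produced by .to_dataframe()
index = pd.MultiIndex.from_product(
    [pd.to_datetime(["2059-12-01", "2060-03-01"]), [53, 82]],
    names=["time", "region"])
df = pd.DataFrame({"air_temperature": [287.6, 281.2, 294.9, 288.1]},
                  index=index)

# One column per region -- handy for CSV export or per-region plotting
wide = df["air_temperature"].unstack("region")
print(wide)
```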