Welcome to tripyview’s documentation!

tripyview.sub_mesh.compute_boundary_edges(e_i)[source]

–> compute edges that have only one adjacent triangle

Parameters:

e_i:

np.array([n2de x 3]), elemental array

Returns:

bnde:

array, vertex indices of the boundary edges (edges with only one adjacent triangle)

tripyview.sub_mesh.compute_nod_in_elem2D(n2dn, e_i, do_arr=False)[source]

–> compute, for each vertex, the list of element indices that contain that vertex

Parameters:

n2dn:

int, number of vertices

e_i:

np.array([n2de x 3]), elemental array

do_arr:

bool (default=False) should the output be a list or a numpy array

Returns:

nod_in_elem2D:

list (or np.array if do_arr=True) with, for each vertex, the indices of the elements that contain it
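
A minimal numpy sketch of the idea (for illustration only, not the exact tripyview implementation): loop over the triangles and append each triangle index to the lists of its three vertices.

import numpy as np

def nod_in_elem2d_sketch(n2dn, e_i, do_arr=False):
    # one (initially empty) list of element indices per vertex
    nod_in_elem2D = [[] for _ in range(n2dn)]
    for ei, verts in enumerate(e_i):      # loop over triangles
        for v in verts:                   # each triangle touches three vertices
            nod_in_elem2D[v].append(ei)
    if do_arr:
        # pad the lists to equal length so they fit into a rectangular array
        nmax = max(len(lst) for lst in nod_in_elem2D)
        nod_in_elem2D = np.array([lst + [-1]*(nmax - len(lst)) for lst in nod_in_elem2D])
    return nod_in_elem2D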

tripyview.sub_mesh.grid_cart3d(lon, lat, R=1.0, is_deg=False)[source]

–> compute 3d cartesian coordinates from spherical geo coordinates (lon, lat, R=1.0)

Parameters:

lon:

array, longitude coordinates in radians

lat:

array, latitude coordinates in radians

R:

float, (default=1.0), Radius of sphere

is_deg:

bool, (default=False) if True, lon and lat are given in degrees; otherwise (False) they are assumed to be in radians

Returns:

x:

array, cartesian x coordinate

y:

array, cartesian y coordinate

z:

array, cartesian z coordinate
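
A minimal sketch of the standard spherical-to-cartesian conversion this routine performs (assumed formula, for illustration; use grid_cart3d itself in practice):

import numpy as np

def cart3d_sketch(lon, lat, R=1.0, is_deg=False):
    # convert to radians if the input is given in degrees
    if is_deg:
        lon, lat = np.radians(lon), np.radians(lat)
    x = R * np.cos(lat) * np.cos(lon)
    y = R * np.cos(lat) * np.sin(lon)
    z = R * np.sin(lat)
    return x, y, z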

tripyview.sub_mesh.grid_cutbox_e(n_x, n_y, e_i, box, which='mid')[source]

–> cut out a region based on box and return which mesh elements are within the box

Parameters:

n_x:

array, longitude coordinates of vertices

n_y:

array, latitude coordinates of vertices

e_i:

element array

box:

list, [lonmin, lonmax, latmin, latmax]

which:

str, (default: 'mid') how restrictive the selection should be - 'soft' … elements with at least one vertex in the box are selected - 'mid' … elements with at least two vertices in the box are selected - 'hard' … elements with all vertices in the box are selected

Returns:

e_inbox:

array, boolean array with 1 (True) for elements in the box, 0 (False) outside the box
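
A numpy sketch of the 'soft'/'mid'/'hard' selection logic described above (assumed for illustration, not the exact tripyview code):

import numpy as np

def cutbox_e_sketch(n_x, n_y, e_i, box, which='mid'):
    # per-vertex flag: is the vertex inside the lon/lat box?
    n_in = (n_x >= box[0]) & (n_x <= box[1]) & (n_y >= box[2]) & (n_y <= box[3])
    # number of in-box vertices per element (0...3)
    n_in_e = n_in[e_i].sum(axis=1)
    nmin = {'soft': 1, 'mid': 2, 'hard': 3}[which]
    return n_in_e >= nmin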

tripyview.sub_mesh.grid_cutbox_n(n_x, n_y, box)[source]

–> cut out a region based on box and return which mesh vertices are within the box

Parameters:

n_x:

array, longitude coordinates of vertices

n_y:

array, latitude coordinates of vertices

box:

list, [lonmin, lonmax, latmin, latmax]

Returns:

n_inbox:

array, boolean array with 1 (True) for vertices in the box, 0 (False) outside the box

tripyview.sub_mesh.grid_focus(focus, rlon, rlat)[source]

–> compute grid rotation around the z-axis to change the focus center of the lon, lat grid; by default focus=0 –> lon=[-180…180], if focus=180 –> lon=[0…360]

Parameters:

focus:

float, longitude of grid center

rlon:

array, longitude in focus=0–>lon=[-180…180] in degree

rlat:

array, latitude in focus=0–>lon=[-180…180] in degree

Returns:

lon:

array, longitude in lon=[-180+focus…180+focus] frame in degree

lat:

array, latitude in lon=[-180+focus…180+focus] frame in degree

tripyview.sub_mesh.grid_g2r(abg, lon, lat)[source]

–> compute grid rotation from the normal geo frame towards the spherical rotated frame using the euler angles alpha, beta, gamma

Parameters:

abg:

list, with euler angles [alpha, beta, gamma]

lon:

array, longitude coordinates of normal geo frame in degree

lat:

array, latitude coordinates of normal geo frame in degree

Returns:

rlon:

array, longitude coordinates in the spherical rotated frame in degree

rlat:

array, latitude coordinates in the spherical rotated frame in degree

tripyview.sub_mesh.grid_interp_e2n(mesh, data_e)[source]

–> interpolate data from elements to vertices, e.g. velocity on elements to velocity on nodes

Parameters:

mesh:

fesom2 mesh object

data_e:

np.array with data on elements, either 2d or 3d

Returns:

data_n:

np.array with data on vertices, either 2d or 3d

tripyview.sub_mesh.grid_r2g(abg, rlon, rlat)[source]

–> compute grid rotation from the spherical rotated frame back towards the normal geo frame using the euler angles alpha, beta, gamma

Parameters:

abg:

list, with euler angles [alpha, beta, gamma]

rlon:

array, longitude coordinates in the spherical rotated frame in degree

rlat:

array, latitude coordinates in the spherical rotated frame in degree

Returns:

lon:

array, longitude coordinates in normal geo frame in degree

lat:

array, latitude coordinates in normal geo frame in degree
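
Usage sketch (assumed import path, toy coordinate values) showing the round trip between the rotated and the geographical frame with the documented signatures:

import numpy as np
from tripyview.sub_mesh import grid_r2g, grid_g2r

abg  = [50, 15, -90]                       # default FESOM Euler angles
rlon = np.array([  0.0, 10.0, 20.0])       # coordinates in the rotated frame
rlat = np.array([ 45.0, 50.0, 55.0])

lon, lat     = grid_r2g(abg, rlon, rlat)   # rotated -> geo
rlon2, rlat2 = grid_g2r(abg, lon, lat)     # geo -> rotated (round trip)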

tripyview.sub_mesh.grid_rotmat(abg)[source]

–> compute euler rotation matrix based on alpha, beta and gamma angle

Parameters:

abg:

list, with euler angles [alpha, beta, gamma]

Returns:

rmat:

array, [3 x 3] rotation matrix to transform from geo to rot

tripyview.sub_mesh.load_mesh_fesom2(meshpath, abg=[50, 15, -90], focus=0, cyclic=360, do_rot='None', do_augmpbnd=True, do_cavity=False, do_lsmask=True, do_lsmshp=True, do_earea=True, do_narea=True, do_eresol=[False, 'mean'], do_nresol=[False, 'e_resol'], do_loadraw=False, do_pickle=True, do_joblib=False, do_f14cmip6=False, do_info=True)[source]

–> load FESOM2 mesh

Parameters:

meshpath:

str, path that leads to FESOM2.0 mesh files (.out)

abg:

list, [alpha,beta,gamma], (default=[50,15,-90]) euler angles used to rotate the grids within the model

focus:

float, (default=0) sets the longitude center of the mesh; focus=0 –> lon=[-180…180], focus=180 –> lon=[0…360]

cyclic:

float, (default=360.0), length of the cyclic domain in degrees longitude; can differ for channel configurations

do_rot:

str, (default='None') should the grid be rotated - 'None' … no rotation is applied - 'r2g' … the loaded grid is rotated and is transformed to geo - 'g2r' … the loaded grid is geo and is transformed to rotated

do_augmpbnd:

bool, (default=True) augment periodic boundary triangles

do_cavity:

bool, (default=False) load also cavity files cavity_nlvls.out and cavity_elvls.out

do_lsmask:

bool, (default=True) compute land-sea mask polygons for the FESOM2 mesh (see mesh.lsmask), augment their periodic boundaries (see mesh.lsmask_a) and compute the land-sea mask patch (see mesh.lsmask_p)

do_lsmshp:

bool, (default=True) save land-sea mask with periodic boundaries to a shapefile

do_earea:

bool, (default=True) compute or load from fesom.mesh.diag.nc the area of elements

do_narea:

bool, (default=True) compute or load from fesom.mesh.diag.nc the clusterarea of vertices

do_eresol:

list([bool,str]), (default: [False,'mean']) compute resolution based on elements, str can be - 'mean' … resolution based on mean element edge length - 'max' … resolution based on maximum edge length - 'min' … resolution based on minimum edge length

do_nresol:

list([bool,str]), (default: [False,’e_resol’]), compute resolution at nodes from interpolation of resolution at elements

do_loadraw:

bool, (default=False) also load the raw vertical level information for elements. It is the vertical level information before the exclusion of elements that have three boundary nodes in the topography

do_pickle:

bool, (default=True) store and load the mesh from a .pckl binary file; pickle5 is only supported for Python < 3.9. If the pickle library cannot be found it switches automatically to joblib

do_joblib:

bool, (default=False) store and load mesh from .joblib binary file

do_f14cmip6:

bool, (default=False) load FESOM1.4 mesh information and squeeze it into the framework of FESOM2. Needed here to compute AMOC on fesom1.4 cmorized CMIP6 data.

do_info:

bool, (default=True) print progress and mesh information

Returns:

mesh:

object, returns fesom_mesh object
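
Usage sketch (the mesh path is a placeholder; options follow the signature above):

from tripyview.sub_mesh import load_mesh_fesom2

mesh = load_mesh_fesom2('/path/to/fesom2/mesh/',   # folder with nod2d.out, elem2d.out, aux3d.out
                        abg=[50, 15, -90],         # Euler angles of the rotated model grid
                        do_rot='None',             # keep the grid orientation as stored on disk
                        do_lsmask=True,            # build land-sea mask polygons
                        do_info=True)
print(mesh.n2dn, mesh.n2de)                        # number of surface vertices and elements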

tripyview.sub_mesh.lsmask_2shapefile(mesh, lsmask=[], path=[], fname=[], do_info=True)[source]

–> save FESOM2 grid land-sea mask polygons to shapefile

Parameters:

mesh:

fesom2 mesh object, contains the periodically augmented land-sea mask polygons in mesh.lsmask_a

lsmask:

list, if empty mesh.lsmask_a is stored in the shapefile; if a list lsmaskin is given, that list is stored instead

path:

str, if empty mesh.path (or the cache path, depending on writing permission) is used as the location to store the shapefile; if path=pathin is given, that string serves as the location to store the .shp file

fname:

str, if empty the fixed filename mypymesh_fesom2_ID_focus=X.shp is used; if not empty, fname=fnamein is used as the filename for the shapefile

do_info:

bool, print info where .shp file is saved, default = True

Returns:

return:

nothing

Info:

–> to load and plot shapefile patches

import os
import shapefile as shp
import matplotlib.pyplot as plt
from matplotlib.patches import Polygon
from matplotlib.collections import PatchCollection

# default shapefile name written by lsmask_2shapefile
shpfname = 'tripyview_fesom2' + '_' + mesh.id + '_' + \
           '{}={}'.format('focus', mesh.focus) + '.shp'
shppath  = os.path.join(mesh.cachepath, shpfname)

# read land-sea mask polygons from the shapefile
sf = shp.Reader(shppath)
patches = [Polygon(shape.points) for shape in sf.shapes()]

plt.figure()
ax = plt.gca()
ax.add_collection(PatchCollection(patches,
                  facecolor=[0.7, 0.7, 0.7],
                  edgecolor='k', linewidths=1.))
ax.set_xlim([-180, 180])
ax.set_ylim([-90, 90])
plt.show()
tripyview.sub_mesh.lsmask_patch(lsmask)[source]

–> computes polygon collection that can be plotted as closed polygon patches with ax.add_collection(PatchCollection(mesh.lsmask_p, facecolor=[0.7,0.7,0.7], edgecolor=’k’,linewidth=0.5))

Parameters:

lsmask: list()

list([array1[npts,2], array2[npts,2]], …)

array1 = np.array([[x1,y1], [x2,y2], …])

Returns:

lsmask_p:

shapely Multipolygon object

Info:

  • how to plot in matplotlib: from descartes import PolygonPatch; ax.add_patch(PolygonPatch(mesh.lsmask_p, facecolor=[0.7,0.7,0.7], edgecolor='k', linewidth=0.5))

  • how to plot in cartopy: import cartopy.crs as ccrs ax.add_geometries(mesh.lsmask_p, crs=ccrs.PlateCarree(), facecolor=[0.6,0.6,0.6], edgecolor=’k’, linewidth=0.5)

class tripyview.sub_mesh.mesh_fesom2(meshpath, abg=[50, 15, -90], focus=0, cyclic=360, focus_old=0, do_rot='None', do_augmpbnd=True, do_cavity=False, do_info=True, do_earea=False, do_earea2=False, do_eresol=[False, 'mean'], do_narea=False, do_nresol=[False, 'n_area'], do_lsmask=True, do_lsmshp=True, do_pickle=True, do_loadraw=True, do_f14cmip6=False)[source]

Bases: object

–> Class that creates an object containing all information about the FESOM2 mesh. As a minimum requirement, the mesh path to the files nod2d.out, elem2d.out and aux3d.out has to be given.


Parameters:

see help(load_mesh_fesom2)


Variables:

path: str, path that leads to FESOM2.0 mesh files (.out)

id: str, identifies mesh

n2dn: int, number of 2d nodes

n2de: int, number of 2d elements

n_x: array, lon position of surface nodes

n_y: array, lat position of surface nodes

e_i: array, elemental array with 2d vertice indices, shape=[n2de,3]

___vertical info____________________________________

nlev: int, number of vertical full cell level

zlev: array, with full depth levels

zmid: array, with mid depth levels

n_z: array, bottom depth based on zlev[n_iz],

n_iz: array, number of full depth levels at vertices

e_iz: array, number of full depth levels at elem

___cavity info (if do_cavity==True)_________________

n_ic: array, full depth level index of cavity-ocean interface at vertices

e_ic: array, full depth level index of cavity-ocean interface at elem

n_c: array, cavity-ocean interface depth at vertices zlev[n_ic]

___area and resolution info__________________________

n_area: array, area at vertices

n_resol: array, resolution at vertices

e_area: array, area at elements

e_resol: array, resolution at elements

___periodic boundary augmentation____________________

n_xa: array, with augmented vertex parameters

n_ya: …

n_za: …

n_iza: …

n_ca: …

n_ica: …

e_ia: array, element array with augmented triangles –> np.vstack((mesh.e_i[mesh.e_pbnd_0,:],mesh.e_ia))

e_pbnd_1: array, elem indices of pbnd elements

e_pbnd_0: array, elem indices of not pbnd elements

e_pbnd_a: array, elem indices of periodic augmented elements –> data_plot = np.hstack((data_plot[mesh.e_pbnd_0],data_plot[mesh.e_pbnd_a]))

n_pbnd_a: array, vertice indices to augment pbnd –> data_plot = np.hstack((data_plot,data_plot[mesh.n_pbnd_a]))

n2dna: int, number of vertices with periodic boundary augmentation

n2dea: int, number of elements with periodic boundary augmentation

___land sea mask (if do_lsmask == True)______________

lsmask: list(array1[npts,2], array2[npts,2], …), contains all land-sea mask polygons for FESOM2 mesh, with periodic boundary

lsmask_a: list(array1[npts,2], array2[npts,2], …), contains all land-sea mask polygons for the FESOM2 mesh, with augmented periodic boundary

lsmask_p: polygon, contains polygon collection that can be plotted as closed polygon patches with ax.add_collection(PatchCollection (mesh.lsmask_p,facecolor=[0.7,0.7,0.7], edgecolor=’k’, linewidth=0.5))


Returns:

mesh: object, returns fesom_mesh object


Info:

create matplotlib triangulation with augmented periodic boundary

import numpy as np
from matplotlib.tri import Triangulation

tri = Triangulation(np.hstack((mesh.n_x, mesh.n_xa)),
                    np.hstack((mesh.n_y, mesh.n_ya)),
                    np.vstack((mesh.e_i[mesh.e_pbnd_0, :], mesh.e_ia)))
augment_lsmask()[source]

–> part of fesom mesh class, split contour lines that span over the periodic boundary into two separate contour lines for the left and right side of the periodic boundaries

compute_e_area()[source]

–> part of fesom mesh class, either load the area of elements from fesom.mesh.diag.nc if it is found in meshpath or recompute it from scratch, [m^2]

compute_e_resol(which='mean')[source]

–> part of fesom mesh class, compute resolution of elements in [m], options:

Parameter:

which: str,
  • "mean" … resolution based on mean element edge length

  • “max” … resolution based on maximum element edge length

  • “min” … resolution based on minimum element edge length

compute_lsmask()[source]

–> part of fesom mesh class, compute the land-sea mask contour line with periodic boundary, based on boundary edges that contribute to only one triangle and then checking which edges can be consecutively connected

compute_n_area()[source]

–> part of fesom mesh class, either load the cluster area of vertices from fesom.mesh.diag.nc if it is found in meshpath or recompute it from scratch using e_area, [m^2]

compute_n_resol(which='n_area')[source]

–> part of fesom mesh class, compute resolution at vertices in m, options:

Parameter:

which: str,
  • "n_area" … compute resolution based on vertex cluster area

  • "e_resol" … compute vertex resolution by interpolating element resolution to vertices (default)

info()[source]
pbnd_augment()[source]

–> part of fesom mesh class, adds additional elements to augment the periodic boundary on the left and right side, so that an even non-periodic boundary is created at the left and right [-180, 180] limits of the domain

pbnd_find()[source]

–> part of fesom mesh class, find elements that cross over the periodic boundary

read_cavity()[source]

–> part of fesom mesh class, read cavity mesh files, cavity_nlvls.out cavity_elvls.out and cavity_elvls_raw.out (if do_loadraw=True)

read_mesh()[source]

–> part of fesom mesh class, read mesh files nod2d.out, elem2d.out, aux3d.out, nlvls.out and elvls.out

tripyview.sub_mesh.vec_r2g(abg, lon, lat, urot, vrot, gridis='geo', do_info=False)[source]

–> In FESOM2 the vector variables are usually given in the rotated coordinate frame in which the model works and need to be rotated into normal geo coordinates. However, in the latest FESOM2 version there is also the option to rotate them within the model via a flag, so make sure you know which case applies to your data

Parameters:

abg:

list, with euler angles [alpha, beta, gamma]

lon:

array, longitude

lat:

array, latitude

urot:

array, zonal velocities in rotated frame

vrot:

array, meridional velocities in rotated frame

gridis:

str, in which coordinate frame lon, lat are given - 'geo','g','geographical' … lon,lat are given in geo coordinates - 'rot','r','rotated' … lon,lat are given in rot coordinates

Returns:

ugeo:

array, zonal velocities in normal geo frame

vgeo:

array, meridional velocities in normal geo frame
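
Usage sketch (assumed import path, toy values) rotating velocities from the rotated model frame into the geographical frame:

import numpy as np
from tripyview.sub_mesh import vec_r2g

abg  = [50, 15, -90]                 # Euler angles the model grid was rotated with
lon  = np.array([0.0, 10.0])         # here given in geo coordinates (gridis='geo')
lat  = np.array([45.0, 50.0])
urot = np.array([0.1, 0.2])          # zonal velocity in the rotated frame
vrot = np.array([0.0, -0.1])         # meridional velocity in the rotated frame

ugeo, vgeo = vec_r2g(abg, lon, lat, urot, vrot, gridis='geo')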

tripyview.sub_plot.do_axes_arrange(nx, ny, xlabel='', ylabel='', tlabel='', fs_label=10.0, fs_title=10.0, fs_ticks=10.0, fs_fac=1, ax_sharex=True, ax_sharey=True, ax_optdict={}, ax_asp=1.0, ax_w='auto', ax_h=4.0, ax_dl=0.6, ax_dr=0.6, ax_dt=0.6, ax_db=0.6, ax_fdl=1.0, ax_fdr=1.0, ax_fdt=1.0, ax_fdb=1.0, ax_fw=1.0, ax_fh=1.0, fig_optdict={}, fig_sizefac=1.0, fig_dl=0.0, fig_dr=0.0, fig_dt=0.0, fig_db=0.0, fig_fdl=1.0, fig_fdr=1.0, fig_fdt=1.0, fig_fdb=1.0, cb_plt=True, cb_plt_single=True, cb_pos='vertical', cb_dl=0.6, cb_dr=3.0, cb_dt=0.6, cb_db=0.6, cb_fdl=1.0, cb_fdr=1.0, cb_fdt=1.0, cb_fdb=1.0, cb_w=0.5, cb_h='auto', cb_fw=1.0, cb_fh=1.0, projection=None, proj=None, box=None, nargout=['hfig', 'hax', 'hcb', 'cb_plt_idx'], **kwargs)[source]

–> do multipanel axes arrangement


Parameters:

___LABEL OPTION____________________

xlabel:

str (default: ‘’) provide prescribed xlabel string

ylabel:

str (default: ‘’) provide prescribed ylabel string

tlabel:

str (default: ‘’) provide prescribed title string

fs_label:

int (default: 10) prescribed fontsize for labels

fs_title:

int (default: 10) prescribed fontsize for title

fs_ticks:

int (default: 10) prescribed fontsize for ticklabels

fs_fac:

int (default: 1) factor to generally increase fontsize

___AXES OPTIONS____________________

ax_sharex:

bool (default: True) all subplot share x-axes

ax_sharey:

bool (default: True) all subplot share y-axes

ax_optdict:

dict (default: dict()) additional axes option: fontssize, …

ax_asp:

float (default: 1.) aspect ratio of axes

ax_w:

float, int (default: 'auto') if 'auto' the width is defined based on the aspect ratio ax_asp and the height ax_h in cm, if float the value is used to define the width in cm

ax_h:

float, int (default: 4) if 'auto' the height is defined based on the aspect ratio ax_asp and the width ax_w in cm, if float the value is used to define the height in cm

ax_fw:

float (default: 1.0) factor to increase width spacing

ax_fh:

float (default: 1.0) factor to increase height spacing

ax_dl:

float (default: 0.6) left spacing around axes in cm

ax_dr:

float (default: 0.6) right spacing around axes in cm

ax_dt:

float (default: 0.6) top spacing around axes in cm

ax_db:

float (default: 0.6) bottom spacing around axes in cm

ax_fdl:

float (default: 1.0) factor to increase left axes spacing

ax_fdr:

float (default: 1.0) factor to increase right axes spacing

ax_fdt:

float (default: 1.0) factor to increase top axes spacing

ax_fdb:

float (default: 1.0) factor to increase bottom axes spacing

___FIGURE OPTION___________________

fig_optdict:

dict (default: dict()) additional figure option: fontssize, …

fig_sizefac:

float (default: 1.) factor to resize figures

fig_dl:

float (default: 0.0) left spacing around figure in cm

fig_dr:

float (default: 0.0) right spacing around figure in cm

fig_dt:

float (default: 0.0) top spacing around figure in cm

fig_db:

float (default: 0.0) bottom spacing around figure in cm

fig_fdl:

float (default: 1.0) factor to increase left figure spacing

fig_fdr:

float (default: 1.0) factor to increase right figure spacing

fig_fdt:

float (default: 1.0) factor to increase top figure spacing

fig_fdb:

float (default: 1.0) factor to increase bottom figure spacing

___COLORBAR OPTION_________________

cb_plt:

bool, list (default: True) if True a colorbar is plotted for every axes (cb_plt_single=False) or just one colorbar is plotted for all axes (cb_plt_single=True); if a list of 0 and 1 [0,1,0,1…], it selects which axes should get a colorbar; if a number higher than 1 appears in the list, it is assumed there is more than one independent colorbar

cb_plt_single:

bool (default: True) if True there is just one colorbar for all axes, if False each axes gets its own colorbar

cb_pos:

str (default: ‘vertical’) orientation of colorbar, either vertical or horizontal

cb_dl:

float (default: 0.6) left spacing around colorbar in cm

cb_dr:

float (default: 3.0) right spacing around colorbar in cm

cb_dt:

float (default: 0.6) top spacing around colorbar in cm

cb_db:

float (default: 0.6) bottom spacing around colorbar in cm

cb_fdl:

float (default: 1.0) factor to increase left colorbar spacing

cb_fdr:

float (default: 1.0) factor to increase right colorbar spacing

cb_fdt:

float (default: 1.0) factor to increase top colorbar spacing

cb_fdb:

float (default: 1.0) factor to increase bottom colorbar spacing

cb_w:

float, str (default: 0.5) if float it is used as width in cm, if ‘auto’ width is defined automatically based on width of axes in case of horizontal colorbar

cb_h:

float, str (default: 'auto') if float it is used as height in cm, if 'auto' the height is defined automatically based on the height of the axes in case of a vertical colorbar

cb_fw:

float (default: 1.0) factor to increase width of colorbar

cb_fh:

float (default: 1.0) factor to increase height of colorbar

___PROJECTION_______________________

projection:

(default: None) provide single cartopy projection object to use for all axes, or provide list of cartopy projection objects

box:

None, list (default: None) regional limitation of plot. For ortho: box = [lonc, latc], nears: [lonc, latc, zoom], for all others box = [lonmin, lonmax, latmin, latmax]

___OUTPUT___________________________

nargout:

list, (default: ['hfig', 'hax', 'hcb', 'cb_plt_idx']) list of variables that are returned from the routine: - hfig … figure handle - hax … list of axes handles - hcb … list of colorbar handles - cb_plt_idx … list that contains the index of independent colorbars (every variable that is defined in this subroutine can become an output parameter)


Returns:

hfig:

returns figure handle

hax:

returns list with axes handle

hcb:

returns colorbar handle

cb_plt_idx:

list that contains index of independent colorbars
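
Usage sketch (assumed, argument names follow the signature above) arranging a 3 x 2 multipanel figure with one shared vertical colorbar:

from tripyview.sub_plot import do_axes_arrange

hfig, hax, hcb, cb_plt_idx = do_axes_arrange(3, 2,                 # nx, ny panels
                                             ax_h=4.0, ax_asp=1.0, # axes height [cm] and aspect ratio
                                             cb_plt=True,
                                             cb_plt_single=True,   # one colorbar for all axes
                                             cb_pos='vertical')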


tripyview.sub_plot.do_axes_enum(hax, do_enum, nrow, ncol, enum_dir='lr', enum_str=[], enum_x=[0.005], enum_y=[1.0], enum_opt={})[source]

–> do enumeration of axes

Parameters:

hax:

list, list of all axes handles

do_enum:

bool, switch for using enumeration

nrow:

int, number of rows in multi panel plot

ncol:

int, number of columns in multi panel plot

enum_dir:

str, (default: ‘lr’) direction of numbering, ‘lr’ from left to right, ‘ud’ from up to down

enum_str:

list, (default: []) overwrite default enumeration strings ,

enum_x:

float, (default: 0.005) x position of enumeration string in axes coordinates

enum_y:

float, (default: 1.000) y position of enumeration string in axes coordinates

enum_opt:

dict, (default: dict()) direct option for enumeration strings via kwarg

Returns:


tripyview.sub_plot.do_cbar(hcb_ii, hax_ii, hp, data, cinfo, do_rescale, cb_label, cb_lunit, cb_ltime, cb_ldep, box_idx=None, norm=None, cb_opt={}, cbl_opt={}, cbtl_opt={})[source]

–> plot colorbars (tripyview also allows more than one colorbar within the multipanel plot)

Parameters:

hcb_ii:

actual colorbar handle

hax_ii:

actual axes handle

hp:

actual plot handle

data:

xarray dataset object with all attributes (needed for the default colorbar labels)

cinfo:

None, dict() (default: None), dictionary with colorbar information. Information that is given is used, the rest is computed. Possible cinfo dictionary entries are:

  • cinfo[‘cmin’], cinfo[‘cmax’], cinfo[‘cref’] … scalar min, max, reference value

  • cinfo[‘crange’] … list with [cmin, cmax, cref] overrides scalar values

  • cinfo[‘cnum’] … minimum number of colors

  • cinfo[‘cstr’] … name of colormap see in sub_colormap_c2c.py

  • cinfo[‘cmap’] … colormap object (‘wbgyr’, ‘blue2red, ‘jet’ …)

  • cinfo[‘clevel’] … color level array

do_rescale:

bool, str, np.array (default: False) do scaling of the colorbar - False … scale data automatically scientifically by 10^x, for data larger than 10^3 or smaller than 10^-3 - log10 … do logarithmic scaling - slog10 … do symmetric logarithmic scaling - np.array() … scale the colorbar stepwise according to the values in np.array, also allows non-linear colortick steps

cb_label:

str, (default: None) if a string it is used as colorbar label, otherwise information from data ('long_name', 'short_name') is used

cb_lunit:

str, (default: None) if a string it is used as colorbar unit label, otherwise info from data is used

cb_ltime:

str, (default: None) if a string it is used as colorbar time label, otherwise info from data is used

cb_ldep:

str, (default: None) if a string it is used as colorbar depth label, otherwise info from data is used

box_idx:

None or index of box selection in data_ii[box_idx]

norm:

None or renormation object

cb_opt:

dict, (default: dict()) direct option for colorbar via kwarg

cbl_opt:

dict, (default: dict()) direct option for colorbar labels (fontsize, fontweight, …) via kwarg

cbtl_opt:

dict, (default: dict()) direct option for colorbar tick labels (fontsize, fontweight, …) via kwarg

Returns:

hcb_ii:

actual colorbar handle


tripyview.sub_plot.do_cbar_formatting(cbar, do_rescale, cinfo, pw_lim=[-3, 4], cbtl_opt={})[source]

–> do formatting of the colorbar for logarithmic and exponential data

Parameters:

cbar:

actual colorbar handle

do_rescale:

bool, str, np.array (default: False) do scaling of the colorbar - False … scale data automatically scientifically by 10^x, for data larger than 10^3 or smaller than 10^-3 - log10 … do logarithmic scaling - slog10 … do symmetric logarithmic scaling - np.array() … scale the colorbar stepwise according to the values in np.array, also allows non-linear colortick steps

cinfo:

dict(), dictionary with colorbar information. Information that is given is used, the rest is computed. Possible cinfo dictionary entries are:

  • cinfo[‘cmin’], cinfo[‘cmax’], cinfo[‘cref’] … scalar min, max, reference value

  • cinfo[‘crange’] … list with [cmin, cmax, cref] overrides scalar values

  • cinfo[‘cnum’] … minimum number of colors

  • cinfo[‘cstr’] … name of colormap see in sub_colormap_c2c.py

  • cinfo[‘cmap’] … colormap object (‘wbgyr’, ‘blue2red, ‘jet’ …)

  • cinfo[‘clevel’] … color level array

pw_lim:

list, decimal limits within which matplotlib will rescale the colorbar with 10^x

cbtl_opt:

dict, (default: dict()) direct option for colorbar tick labels (fontsize, fontweight, …) via kwarg

Returns:

cbar:

actual colorbar handle


tripyview.sub_plot.do_climit_hist(data_in, ctresh=0.99, cbin=1000, cweights=None)[source]

–> compute the min/max value range via a histogram, i.e. computation of the cumulative distribution function and cutting it off at a certain threshold

Parameters:

data_in:

np.array with data to plot

ctresh:

float, (default: 0.99) cover 99% of the value range, meaning that extreme outliers are cut off

cbin:

int, (default: 1000) number of bins used for the value range histogram

cweights:

None or np.array, (default: None) weights for the histogram

Returns:

cmin:

return minimum value of the value range

cmax:

return maximum value of the value range
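
A minimal numpy sketch of the idea (for illustration, not the exact tripyview implementation): build a histogram of the data, take its cumulative distribution and cut off the outer tails so that a fraction ctresh of all values lies between cmin and cmax.

import numpy as np

def climit_hist_sketch(data_in, ctresh=0.99, cbin=1000, cweights=None):
    data = np.asarray(data_in).ravel()
    w    = None if cweights is None else np.asarray(cweights).ravel()
    ok   = np.isfinite(data)                       # drop NaN/Inf values
    data = data[ok]
    if w is not None: w = w[ok]
    hist, edges = np.histogram(data, bins=cbin, weights=w)
    cdf  = np.cumsum(hist) / hist.sum()            # cumulative distribution function
    cut  = 0.5 * (1.0 - ctresh)                    # e.g. 0.5% cut on each tail
    cmin = edges[np.searchsorted(cdf, cut)]
    cmax = edges[np.searchsorted(cdf, 1.0 - cut) + 1]
    return cmin, cmax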


tripyview.sub_plot.do_data_norm(cinfo, do_rescale)[source]

–> prepare renormation object, for log10 or slog10

Parameters:

cinfo:

None, dict() (default: None), dictionary with colorbar information. Information that is given is used, the rest is computed. Possible cinfo dictionary entries are:

  • cinfo[‘cmin’], cinfo[‘cmax’], cinfo[‘cref’] … scalar min, max, reference value

  • cinfo[‘crange’] … list with [cmin, cmax, cref] overrides scalar values

  • cinfo[‘cnum’] … minimum number of colors

  • cinfo[‘cstr’] … name of colormap see in sub_colormap_c2c.py

  • cinfo[‘cmap’] … colormap object (‘wbgyr’, ‘blue2red, ‘jet’ …)

  • cinfo[‘clevel’] … color level array

do_rescale:

bool, str, np.array (default: False) do scaling of the colorbar - False … scale data automatically scientifically by 10^x, for data larger than 10^3 or smaller than 10^-3 - log10 … do logarithmic scaling (mcolors.LogNorm) - slog10 … do symmetric logarithmic scaling (mcolors.SymLogNorm) - np.array() … scale the colorbar stepwise according to the values in np.array, also allows non-linear colortick steps (mcolors.BoundaryNorm)

Returns:
which_norm:

None or renormation object
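
A minimal sketch of the mapping described above (assumed, not the exact tripyview code), using the matplotlib norm classes named in the do_rescale description:

import numpy as np
import matplotlib.colors as mcolors

def data_norm_sketch(cinfo, do_rescale):
    if isinstance(do_rescale, str) and do_rescale == 'log10':
        return mcolors.LogNorm(vmin=cinfo['cmin'], vmax=cinfo['cmax'])
    elif isinstance(do_rescale, str) and do_rescale == 'slog10':
        # linthresh is a placeholder value around which the scale stays linear
        return mcolors.SymLogNorm(linthresh=1.0, vmin=cinfo['cmin'], vmax=cinfo['cmax'])
    elif isinstance(do_rescale, np.ndarray):
        return mcolors.BoundaryNorm(do_rescale, ncolors=do_rescale.size - 1)
    return None                                    # default: no special normalisation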


tripyview.sub_plot.do_data_prepare_unstruct(mesh, tri, data_plot, do_ie2n)[source]

–> prepare data for plotting: augment periodic boundaries, interpolate from elements to nodes, kick out NaN values from plotting

Parameters:

mesh:

fesom2 mesh object, with all mesh information

data_plot:

np.array of unstructured data

tri:

matplotlib.tri triangulation object - tri.mask_e_box … bool np.array with element masking from regional box definition - tri.mask_n_box … bool np.array with vertices masking from regional box selection

Returns:

data_plot:

np.array of unstructured data, augmented with periodic boundary, limited to regional box

tri:

matplotlib.tri triangulation object


tripyview.sub_plot.do_data_prepare_vslice(hax_ii, data_ii, box_idx)[source]

–> prepare data for plotting: augment periodic boundaries, interpolate from elements to nodes, kick out NaN values from plotting

Parameters:

hax_ii:

handle of current axes

data_ii:

xarray dataset object of axes ii, can contains the info data_ii[box_idx] of several defined boxes which can be selected via box_idx index

box_idx:

index of box selection in data_ii[box_idx]

Returns:

data_x:

np.array, data for x-axis

data_y:

np.array, data for y-axis

data_plot:

np.array, data to plot on regular vertical grid


tripyview.sub_plot.do_plt_bot(hax_ii, do_bot, tri=None, data_x=None, data_y=None, data_plot=None, ylim=None, bot_opt={})[source]

–> plot bottom mask

Parameters:

hax_ii:

handle of axes ii

do_bot:

bool, (default: True), overlay topographic bottom mask

tri:

matplotlib.tri triangulation object (default=None) - tri.mask_e_ok…provide mask with nan values, that describe the bottom limited to regional box

data_x:

regular longitude array (default=None)

data_y:

regular latitude array (default=None)

data_plot:

np.array of regular gridded data (default=None)

ylim:

list, (default=None), overwrite limit of yaxis

bot_opt:

dict, (default: dict()) additional options that are given to the bottom mask plotting via kwarg

Returns:

h0:

return handle of bottom plot


tripyview.sub_plot.do_plt_data(hax_ii, do_plt, tri, data_plot, cinfo_plot, which_norm_plot, plt_opt={}, plt_contb=False, pltcb_opt={}, plt_contf=False, pltcf_opt={}, plt_contr=False, pltcr_opt={}, plt_contl=False, pltcl_opt={})[source]

–> plot triangular data based on tripcolor or tricontourf

Parameters:

hax_ii:

handle of axes ii

do_plt:

str, (default: tpc) - tpc … make pseudocolor plot (tripcolor) - tcf … make filled contour plot (tricontourf)

tri:

matplotlib.tri triangulation object - tri.mask_e_ok…provide mask with nan values, that describe the bottom limited to regional box

data_plot:

np.array of unstructured data, augmented with periodic boundary,

cinfo_plot:

None, dict() (default: None), dictionary with colorbar information. Information that is given is used, the rest is computed. Possible cinfo dictionary entries are:

  • cinfo[‘cmin’], cinfo[‘cmax’], cinfo[‘cref’] … scalar min, max, reference value

  • cinfo[‘crange’] … list with [cmin, cmax, cref] overrides scalar values

  • cinfo[‘cnum’] … minimum number of colors

  • cinfo[‘cstr’] … name of colormap see in sub_colormap_c2c.py

  • cinfo[‘cmap’] … colormap object (‘wbgyr’, ‘blue2red, ‘jet’ …)

  • cinfo[‘clevel’] … color level array

which_norm_plot:

None or renormation object

plt_opt:

dict, (default: dict()) additional options that are given to tripcolor or tricontourf via the kwarg argument

plt_contb:

bool, (default: False) overlay thin contour lines of all colorbar steps (background)

pltcb_opt:

dict, (default: dict()) background contour line option

plt_contf:

bool, (default: False) overlay thicker contour lines of the main colorbar steps (foreground)

pltcf_opt:

dict, (default: dict()) foreground contour line option

plt_contr:

bool, (default: False) overlay thick contour lines of reference color steps (reference)

pltcr_opt:

dict, (default: dict()) reference contour line option

plt_contl:

bool, (default: False) label the overlaid contour line plot

pltcl_opt:

dict, (default: dict()) additional options that are given to clabel via the kwarg argument

Returns:

h0:

return matplotlib handle of plot
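
A toy matplotlib sketch of the two plot modes named above ('tpc' vs 'tcf'), using a random triangulation instead of a FESOM mesh:

import numpy as np
import matplotlib.pyplot as plt
import matplotlib.tri as mtri

x, y = np.random.rand(50), np.random.rand(50)
tri  = mtri.Triangulation(x, y)
data = np.sin(3*x) * np.cos(3*y)

fig, (ax1, ax2) = plt.subplots(1, 2)
ax1.tripcolor(tri, data)                 # 'tpc' ... pseudocolor plot
ax2.tricontourf(tri, data, levels=11)    # 'tcf' ... filled contour plot
plt.show()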


tripyview.sub_plot.do_plt_datareg(hax_ii, do_plt, data_x, data_y, data_plot, cinfo_plot, which_norm_plot, plt_opt={}, which_transf=None, plt_contb=False, pltcb_opt={}, plt_contf=False, pltcf_opt={}, plt_contr=False, pltcr_opt={}, plt_contl=False, pltcl_opt={})[source]

–> plot regular gridded data (binned, coarse grained data) via pcolormesh and contourf

Parameters:

hax_ii:

handle of axes ii

do_plt:

str, (default: tpc) - tpc … make pseudocolor plot (pcolormesh) - tcf … make filled contour plot (contourf)

data_x:

regular longitude array

data_y:

regular latitude array

data_plot:

np.array of regular gridded data

cinfo_plot:

None, dict() (default: None), dictionary with colorbar information. Information that is given is used, the rest is computed. Possible cinfo dictionary entries are:

  • cinfo[‘cmin’], cinfo[‘cmax’], cinfo[‘cref’] … scalar min, max, reference value

  • cinfo[‘crange’] … list with [cmin, cmax, cref] overrides scalar values

  • cinfo[‘cnum’] … minimum number of colors

  • cinfo[‘cstr’] … name of colormap see in sub_colormap_c2c.py

  • cinfo[‘cmap’] … colormap object (‘wbgyr’, ‘blue2red, ‘jet’ …)

  • cinfo[‘clevel’] … color level array

which_norm_plot:

None or renormation object

plt_opt:

dict, (default: dict()) additional options that are given to tripcolor or tricontourf via the kwarg argument

plt_contb:

bool, (default: False) overlay thin contour lines of all colorbar steps (background)

pltcb_opt:

dict, (default: dict()) background contour line option

plt_contf:

bool, (default: False) overlay thicker contour lines of the main colorbar steps (foreground)

pltcf_opt:

dict, (default: dict()) foreground contour line option

plt_contr:

bool, (default: False) overlay thick contour lines of reference color steps (reference)

pltcr_opt:

dict, (default: dict()) reference contour line option

plt_contl:

bool, (default: False) label the overlaid contour line plot

pltcl_opt:

dict, (default: dict()) additional options that are given to clabel via the kwarg argument

Returns:

h0: return matplotlib handle of plot


tripyview.sub_plot.do_plt_gridlines(hax_ii, do_grid, box, ndat, data_x=None, data_y=None, xlim=None, ylim=None, grid_opt={}, proj=None, do_rescale=None)[source]

–> plot cartopy gridlines and general gridlines together with the limit scaling of the axes (see non-linear option of the x and y axis)

Parameters:

hax_ii:

handle of one axes

do_grid:

bool, (default: True) plot cartopy grid lines

box:

None, or list with box definitions

ndat:

int, total length of data list

data_x:

regular longitude array

data_y:

regular latitude array

xlim:

list, (default=None), overwrite limit of xaxis

ylim:

list, (default=None), overwrite limit of yaxis

grid_opt:

dict, (default: dict()) additional options that are given to the cartopy gridline plotting via kwarg

proj:

None, cartopy projection object or string (e.g. index+depth+time…)

do_rescale:

bool, str, np.array (default: False) do scaling of the colorbar - False … scale data automatically scientifically by 10^x, for data larger than 10^3 or smaller than 10^-3 - log10 … do logarithmic scaling - slog10 … do symmetric logarithmic scaling - np.array() … scale the colorbar stepwise according to the values in np.array, also allows non-linear colortick steps

Returns:

h0:

None or handle


tripyview.sub_plot.do_plt_lsmask(hax_ii, do_lsm, mesh, lsm_opt={}, resolution='low')[source]

–> plot fesom mesh inverted land sea mask

Parameters:

hax_ii:

handle of axes ii

do_lsm:

str, (default: ‘fesom’), overlay FESOM grid inverted land sea mask option are here:

  • fesom … grey fesom landsea mask

  • stock … uses cartopy stock image

  • bluemarble … uses bluemarble image in folder tripyview/background/

  • etopo … uses etopo image in folder tripyview/background/

mesh:

fesom2 mesh object, with all mesh information

lsm_opt:

dict, (default: dict()) additional options that are given to the landsea mask plotting via kwarg

resolution:

str, (default: ‘low’) switch resolution of background image for bluemarble and etopo between ‘high’ and ‘low’

Returns:

h0:

return handle of plot


tripyview.sub_plot.do_plt_mesh(hax_ii, do_mesh, tri, mesh_opt={})[source]

–> plot overlaying triangular mesh

Parameters:

hax_ii:

handle of axes ii

do_mesh:

bool, (default: True), overlay FESOM grid over dataplot

tri:

matplotlib.tri triangulation object - tri.mask_e_ok…provide mask with nan values, that describe the bottom limited to regional box

mesh_opt:

dict, (default: dict()) additional options that are given to the mesh plotting via kwarg

Returns:

h0:

return handle of plot


tripyview.sub_plot.do_plt_quiver(hax_ii, do_quiv, tri, data_plot_u, data_plot_v, cinfo_plot, norm_plot, quiv_scalfac=1, quiv_arrwidth=0.25, quiv_dens=0.4, quiv_smax=10, quiv_shiftL=2, quiv_smooth=2, quiv_opt={})[source]

–> plot triangular data as quiver plot

Parameters:

hax_ii:

handle of axes ii

do_quiv:

bool, do cartopy quiver plot

tri:

matplotlib.tri triangulation object - tri.mask_e_ok…provide mask with nan values, that describe the bottom limited to regional box

data_plot_u:

np.array of unstructured zonal vector component

data_plot_v:

np.array of unstructured meridional vector component

cinfo_plot:

None, dict() (default: None), dictionary with colorbar information. Information that is given is used, the rest is computed. Possible cinfo dictionary entries are:

  • cinfo[‘cmin’], cinfo[‘cmax’], cinfo[‘cref’] … scalar min, max, reference value

  • cinfo[‘crange’] … list with [cmin, cmax, cref] overrides scalar values

  • cinfo[‘cnum’] … minimum number of colors

  • cinfo[‘cstr’] … name of colormap see in sub_colormap_c2c.py

  • cinfo[‘cmap’] … colormap object (‘wbgyr’, ‘blue2red, ‘jet’ …)

  • cinfo[‘clevel’] … color level array

norm_plot:

None or renormation object

quiv_scalfac:

float, (default: 1.0) bigger means larger arrows

quiv_arrwidth:

float, (default: 0.25) scale arrow width

quiv_dens:

float, (default: 0.4) larger values mean more arrows are excluded

quiv_smax:

float, (default: 10) small arrows are scaled up strongly with factor smax; it is off when smax=1

quiv_shiftL:

float, (default: 2) shift the smoothing function to the left

quiv_smooth:

float, (default: 2) slope of the transition zone; a smaller value means a steeper transition

quiv_opt:

dict, (default: dict()) additional options that are given to quiver plot routine

Returns:

h0:

return handle of quiver plot


tripyview.sub_plot.do_plt_streaml_reg(hax_ii, ii, do_streaml, streaml_dat=None, streaml_opt={}, do_streaml_leg=True, streaml_leg_opt={}, box=None, which_transf=ccrs.PlateCarree())[source]

–> plot streamlines over scalar data, based on regular gridded lon,lat vector data. The data can originate either from coarse graining or interpolation

Parameters:

hax_ii:

handle of axes ii

ii:

int, index of handle ii

do_streaml:

bool, do cartopy streamline plot

streaml_dat:

list of xr.Datasets with u,v data in it (default=None)

Dimensions:  (lat: 85, lon: 180, n2: 2)
Coordinates:
* lat      (lat) float32 340B -79.0 -77.0 -75.0 -73.0 ... 83.0 85.0 87.0 89.0
* lon      (lon) float32 720B -179.0 -177.0 -175.0 ... 175.0 177.0 179.0
    nzi      float64 8B 15.89
    nz1      int64 8B 250
    lat_bnd  (lat, n2) float32 680B -80.0 -78.0 -78.0 -76.0 ... 88.0 nan nan
    lon_bnd  (lon, n2) float32 1kB -180.0 -178.0 -178.0 ... 178.0 178.0 180.0
    w_A      (lat, lon) float32 61kB 9.437e+09 9.437e+09 9.437e+09 ... nan nan
Dimensions without coordinates: n2
Data variables:
    u        (lat, lon) float32 61kB -0.003229 -0.003186 -0.002711 ... nan nan
    v        (lat, lon) float32 61kB 0.0007711 0.000357 0.001531 ... nan nan
Attributes: (12/19)
    FESOM_model:                         FESOM2
    FESOM_website:                       fesom.de
    FESOM_git_SHA:                       02f7d080
    FESOM_MeshPath:                      /albedo/work/user/pscholz/mesh_fesom...
    FESOM_mesh_representative_checksum:  297ddf9c482ca68c86a979e1bd5d3c97
    FESOM_ClimateDataPath:               /albedo/work/projects/p_fesom/FROM-O...
    ...                                  ...
streaml_opt:

dict, (default: dict()) additional options that are given to streamline plot routine

do_streaml_leg:

bool, (default: True) overlay a legend for the streamline plot; keep in mind that the position of the legend is somewhat preset by hand

streaml_leg_opt:

dict, (default=dict()) additional option for the position of the streamline legend

box:

None, list (default: None) regional limitation of plot. For ortho… box=[lonc, latc], nears…box=[lonc, latc, zoom], for all others box = [lonmin, lonmax, latmin, latmax]

which_transf:

ccrs.CRS (default: ccrs.PlateCarree()) source projection of the data

Returns:

h0:

return matplotlib handle of streamline


tripyview.sub_plot.do_plt_topo(hax_ii, do_topo, data_topo, mesh, tri, plt_opt={}, plt_contb=True, pltcb_opt={}, plt_contl=False, pltcl_opt={})[source]

–> plot topography contour or pcolor

Parameters:

hax_ii:

handle of axes ii

do_topo:

bool, (default: True), overlay model topography in quiver plots

data_topo:

np.array with unstructured data of the model topography

mesh:

fesom2 mesh object, with all mesh information

tri:

matplotlib.tri triangulation object - tri.mask_e_box … bool np.array with element masking from regional box definition - tri.mask_n_box … bool np.array with vertices masking from regional box selection

plt_opt:

dict, (default: dict()) additional options that are given to tripcolor or tricontourf via the kwarg argument

plt_contb:

bool, (default: False) overlay thin contour lines of all colorbar steps (background)

pltcb_opt:

dict, (default: dict()) background contour line option

plt_contl:

bool, (default: False) label the overlaid contour line plot

pltcl_opt:

dict, (default: dict()) additional options that are given to clabel via the kwarg argument

Returns:

h0:

return handle of plot


tripyview.sub_plot.do_projection(mesh, proj, box)[source]

–> set cartopy target projection

Parameters:

mesh:

fesom2 mesh object, with all mesh information

proj:

str, (default: 'pc') which projection should be used, - pc … PlateCarree (box=[lonmin, lonmax, latmin, latmax]) - merc … Mercator (box=[lonmin, lonmax, latmin, latmax]) - rob … Robinson (box=[lonmin, lonmax, latmin, latmax]) - eqearth… EqualEarth (box=[lonmin, lonmax, latmin, latmax]) - mol … Mollweide (box=[lonmin, lonmax, latmin, latmax]) - nps … NorthPolarStereo (box=[-180, 180, >0, latmax]) - sps … SouthPolarStereo (box=[-180, 180, latmin, <0]) - ortho … Orthographic (box=[loncenter, latcenter]) - nears … NearsidePerspective (box=[loncenter, latcenter, zoom]) - channel… PlateCarree

box:

None, list (default: None) regional limitation of plot. For ortho box = [lonc, latc], nears [lonc, latc, zoom], for all others box = [lonmin, lonmax, latmin, latmax]

Returns:

proj_to:

cartopy projection object

box:

return projection adapted box list
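
A sketch of what some of the projection shortcuts listed above correspond to in cartopy (assumed mapping for illustration):

import cartopy.crs as ccrs

proj_map = {'pc'    : ccrs.PlateCarree(),
            'merc'  : ccrs.Mercator(),
            'rob'   : ccrs.Robinson(),
            'nps'   : ccrs.NorthPolarStereo(),
            'sps'   : ccrs.SouthPolarStereo(),
            'ortho' : ccrs.Orthographic(central_longitude=0.0, central_latitude=45.0)}
proj_to = proj_map['pc']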


tripyview.sub_plot.do_reindex_vert_and_elem(tri, e_box_mask)[source]

–> reindex element indices in case of excluded triangles and unused vertices

Parameters:

tri:

matplotlib.tri triangulation object

e_box_mask:

bool np.array with element masking from regional box definition

Returns:

tri:

matplotlib.tri triangulation object where unreferenced triangles are kicked out and the indices of the remaining vertices are adapted accordingly in the element list

n_box_mask:

bool np.array with vertices masking from regional box selection


tripyview.sub_plot.do_savefigure(do_save, hfig, dpi=300, do_info=True, save_opt={})[source]

–> save figure to file

Parameters:

do_save:

None, str (default: None) if None the figure will not be saved, if a string the figure will be saved; the string must give the directory and filename where to save

hfig:

figure handle

dpi:

int, (default: 300), resolution of the image

save_opt:

dict, (default: dict()) direct option for saving via kwarg

Returns:


tripyview.sub_plot.do_setupcinfo(cinfo, data, do_rescale, mesh=None, tri=None, do_vec=False, do_index=False, do_moc=False, do_dmoc=None, do_hbstf=False, box_idx=0)[source]

–> build up colormap dictionary

Parameters:

cinfo:

dict(), dictionary with colorbar information. Information that is given is used, the rest is computed. Possible cinfo dictionary entries are:

  • cinfo[‘cmin’], cinfo[‘cmax’], cinfo[‘cref’] … scalar min, max, reference value

  • cinfo[‘crange’] … list with [cmin, cmax, cref] overrides scalar values

  • cinfo[‘cnum’] … minimum number of colors

  • cinfo[‘cstr’] … name of colormap see in sub_colormap_c2c.py

  • cinfo[‘cmap’] … colormap object (‘wbgyr’, ‘blue2red, ‘jet’ …)

  • cinfo[‘clevel’] … color level array

data:

xarray dataset object

do_rescale:

bool, str, np.array (default: False) do scaling of the colorbar - False … scale data automatically scientifically by 10^x, for data larger than 10^3 or smaller than 10^-3 - log10 … do logarithmic scaling - slog10 … do symmetric logarithmic scaling - np.array() … scale the colorbar stepwise according to the values in np.array, also allows non-linear colortick steps

mesh:

None or fesom2 mesh object,

tri:

None or matplotlib.tri triangulation object

do_vec:

bool (default: False) flag, input data are vector data

do_index:

bool (default: False) flag, input data are index data

do_moc:

bool (default: False) flag, input data are zmoc data

do_dmoc:

str (default: None) input data are dmoc data ('inner', 'dmoc', 'srf')

do_hbstf:

bool (default: False)

box_idx:

in case input data are list of regional shapefile boxes, this is the index of a specific box

Returns:

cinfo:

None, dict() (default: None), dictionary with colorbar info


tripyview.sub_plot.do_triangulation(hax, mesh, proj_to, box, proj_from=None, do_triorig=False, do_narea=True, do_earea=False)[source]

–> create matplotlib triangulation object

Parameters:

hax:

handle of current axes

mesh:

fesom2 mesh object, with all mesh information

proj_to:

cartopy destination projection object

proj_from:

cartopy source projection object (default: ccrs.PlateCarree())

do_triorig:

bool, (default=False) save the original vertex coordinates in lon,lat space

do_narea:

bool (default=True), carry the vertex areas along; needed for normalisation

do_earea:

bool (default=False), carry the element areas along; needed for normalisation

Returns:

tri:

matplotlib.tri triangulation object


tripyview.sub_plot.plot_hline(data, box=None, box_idx=None, box_label=None, boxl_opt={}, nrow=1, ncol=1, proj='index+xy', n_cycl=None, do_allcycl=False, do_shdw=True, do_mean=False, do_rescale=False, plt_opt={}, mark_opt={}, do_grid=True, grid_opt={}, ax_title='descript', ax_xlabel=None, ax_ylabel=None, ax_xunit=None, ax_yunit=None, ax_opt={}, ax_xlim=None, ax_ylim=None, do_enum=False, enum_opt={'horizontalalignment': 'center'}, enum_str=[], enum_x=[0.0], enum_y=[1.005], enum_dir='lr', do_save=None, save_dpi=300, save_opt={}, nargout=['hfig', 'hax'])[source]

–> do plotting of horizontal lines over index region (e.g. heatflux vs lon, lat)


Parameters:

data:

xarray dataset object, or list of xarray dataset object

box:

None, list (default: None) list with regional box limitation for index computation box can be:

  • [‘global’] … compute global index

  • [shp.Reader] … index region defined by shapefile

  • [ [lonmin,lonmax,latmin, latmax], boxname] index region defined by rect box

  • [ [ [px1,px2…], [py1,py2,…]], boxname] index region defined by polygon

  • [ np.array(2 x npts) boxname] index region defined by polygon

box_idx:

int, (default: None) index in boxlist

box_label:

str (default: None) overwrites boxname string

boxl_opt:

dict (default: dict()) additional options for the boxlabel string (fontsize, …)

nrow:

int, (default: 1) number of rows when plotting multiple data panels

ncol:

int, (default: 1) number of columns when plotting multiple data panels

proj:

str, (default: ‘index+xy’)

n_cycl:

int, (default: None) How many spinup cycles are contained in the data_list. Info is needed when do_allcycl=True,

do_allcycl:

bool, (default: False) plot all spinupcycles based on colormap value

do_shdw:

bool, (default: True) give lines a black outline

do_mean:

bool, (default: False) plot triangle on the left side that indicates the mean value

do_rescale:

bool, str, np.array (default: False) do scaling of the colorbar - False … scale data automatically scientifically by 10^x, for data larger than 10^3 or smaller than 10^-3 - log10 … do logarithmic scaling - slog10 … do symmetric logarithmic scaling

___plot data parameters_____________________________

plt_opt:

dict, (default: dict()) additional options that are given to line plot routine via the kwarg argument

mark_opt:

dict, (default: dict()) additional options that are given to control the line markers via the kwarg argument

___plot gridlines___________________________________

do_grid:

bool, (default: True) plot cartopy grid lines

grid_opt:

dict, (default: dict()) additional options that are given to the cartopy gridline plotting via kwarg

___axes______________________________________

ax_title:

str, (default: 'descript') if 'descript' use the descript attribute in data as the axes title, if any other string use that string as the axes title

ax_xlabel:

str, (default: None) overwrites default xlabel

ax_ylabel:

str, (default: None) overwrites default ylabel

ax_xunit:

str, (default: None) overwrites default xlabel unit

ax_yunit:

str, (default: None) overwrites default ylabel unit

ax_opt:

dict, (default: dict()) set option for axes arangement see subroutine do_axes_arrange

ax_xlim:

list (default: None) override the data related x-limits

ax_ylim:

list (default: None) override the data related y-limits

___enumerate_________________________________

do_enum:

bool, (default: False) do enumeration of axes with a), b), c) …

enum_opt:

dict, (default: dict()) direct option for enumeration strings via kwarg

enum_str:

list, (default: []) overwrite default enumeration strings

enum_x:

float, (default: 0.005) x position of enumeration string in axes coordinates

enum_y:

float, (default: 1.000) y position of enumeration string in axes coordinates

enum_dir:

str, (default: ‘lr’) direction of numbering, ‘lr’ from left to right, ‘ud’ from up to down

___save figure________________________________

do_save:

str, (default: None) if None the figure will not be saved, if a string the figure will be saved; the string must give the directory and filename where to save

save_dpi:

int, (default: 300) dpi resolution at which the figure is saved

save_opt:

dict, (default: dict()) direct option for saving via kwarg

___set output_________________________________

nargout:

list, (default: ['hfig', 'hax']) list of variables that are returned from the routine. Default:

  • hfig … figure handle

  • hax … list of axes handle

  • (every variable that is defined in this subroutine can become output parameter)


Returns:

hfig:

returns figure handle

hax:

returns list with axes handle


tripyview.sub_plot.plot_hmesh(mesh, data=None, box=None, cinfo=None, nrow=1, ncol=1, proj='pc', do_ie2n=False, do_rescale=False, do_plt='tpc', plt_opt={}, plt_contb=False, pltcb_opt={}, plt_contf=False, pltcf_opt={}, plt_contr=False, pltcr_opt={}, plt_contl=False, pltcl_opt={}, do_mesh=True, mesh_opt={}, do_lsm='fesom', lsm_opt={}, lsm_res='low', do_grid=True, do_boundbox=True, grid_opt={}, cb_label=None, cb_lunit=None, cb_ltime=None, cb_ldep=None, cb_opt={}, cbl_opt={}, cbtl_opt={}, ax_title=None, ax_opt={}, do_enum=False, enum_opt={}, enum_str=[], enum_x=[0.005], enum_y=[1.0], enum_dir='lr', do_save=None, save_dpi=300, save_opt={}, nargout=['hfig', 'hax', 'hcb'])[source]

–> plot horizontal mesh and mesh parameters on vertices and elements


Parameters:

mesh:

fesom2 mesh object, with all mesh information

data:

str, (default: None) string can be:

  • ‘resolution’, ‘resol’, ‘n_resol’, ‘nresol’

  • ‘narea’, ‘n_area’, ‘clusterarea’, ‘scalararea’

  • ‘eresol’, ‘e_resol’, ‘triresolution’, ‘triresol’

  • ‘earea’, ‘e_area’, ‘triarea’

  • ‘ndepth’, ‘ntopo’, ‘n_depth’, ‘n_topo’, ‘topography’, ‘zcoord’

  • ‘edepth’, ‘etopo’, ‘e_depth’, ‘e_topo’

box:

None, list (default: None) regional limitation of plot. For ortho… box=[lonc, latc], nears…box=[lonc, latc, zoom], for all others box = [lonmin, lonmax, latmin, latmax]

cinfo:

None, dict() (default: None), dictionary with colorbar information. Information that is given is used, the rest is computed. Possible cinfo dictionary entries are:

  • cinfo[‘cmin’], cinfo[‘cmax’], cinfo[‘cref’] … scalar min, max, reference value

  • cinfo[‘crange’] … list with [cmin, cmax, cref] overrides scalar values

  • cinfo[‘cnum’] … minimum number of colors

  • cinfo[‘cstr’] … name of colormap see in sub_colormap_c2c.py

  • cinfo[‘cmap’] … colormap object (‘wbgyr’, ‘blue2red, ‘jet’ …)

  • cinfo[‘clevel’] … color level array

nrow:

int, (default: 1) number of rows when plotting multiple data panels

ncol:

int, (default: 1) number of columns when plotting multiple data panels

proj:

str, (default: 'pc') which projection should be used, - pc … PlateCarree (box=[lonmin, lonmax, latmin, latmax]) - merc … Mercator (box=[lonmin, lonmax, latmin, latmax]) - rob … Robinson (box=[lonmin, lonmax, latmin, latmax]) - eqearth… EqualEarth (box=[lonmin, lonmax, latmin, latmax]) - mol … Mollweide (box=[lonmin, lonmax, latmin, latmax]) - nps … NorthPolarStereo (box=[-180, 180, >0, latmax]) - sps … SouthPolarStereo (box=[-180, 180, latmin, <0]) - ortho … Orthographic (box=[loncenter, latcenter]) - nears … NearsidePerspective (box=[loncenter, latcenter, zoom]) - channel… PlateCarree

do_ie2n:

bool, (default: False) do interpolation of data on elements towards nodes

do_rescale:

bool, str, np.array (default: False) do scaling of the colorbar - False … scale data automatically scientifically by 10^x, for data larger than 10^3 or smaller than 10^-3 - log10 … do logarithmic scaling - slog10 … do symmetric logarithmic scaling - np.array() … scale the colorbar stepwise according to the values in np.array, also allows non-linear colortick steps

___plot data parameters_____________________________

do_plt:

str, (default: tpc) - tpc … make pseudocolor plot (tripcolor) - tcf … make contourf color plot (tricontourf)

plt_opt:

dict, (default: dict()) additional options that are given to tripcolor or tricontourf via the kwarg argument

plt_contb:

bool, (default: False) overlay thin contour lines of all colorbar steps (background)

pltcb_opt:

dict, (default: dict()) background contour line option

plt_contf:

bool, (default: False) overlay thicker contour lines of the main colorbar steps (foreground)

pltcf_opt:

dict, (default: dict()) foreground contour line option

plt_contr:

bool, (default: False) overlay thick contour lines of reference color steps (reference)

pltcr_opt:

dict, (default: dict()) reference contour line option

plt_contl:

bool, (default: False) label the overlaid contour line plot

pltcl_opt:

dict, (default: dict()) additional options that are given to clabel via the kwarg argument

___plot mesh________________________________________

do_mesh:

bool, (default: True), overlay FESOM grid over dataplot

mesh_opt:

dict, (default: dict()) additional options that are given to the mesh plotting via kwarg

___plot bottom mask_________________________________

do_bot:

bool, (default: True), overlay topographic bottom mask

bot_opt:

dict, (default: dict()) additional options that are given to the bottom mask plotting via kwarg

___plot lsmask______________________________________

do_lsm:

str, (default: ‘fesom’), overlay FESOM grid inverted land sea mask option are here:

  • fesom … grey fesom landsea mask

  • stock … uses cartopy stock image

  • bluemarble … uses bluemarble image in folder tripyview/background/

  • etopo … uses etopo image in folder tripyview/background/

lsm_opt:

dict, (default: dict()) additional options that are given to the landsea mask plotting via kwarg

lsm_res:

str, (default=’low’) resolution of bluemarble texture file either ‘low’ or ‘high’

___plot cartopy gridlines____________________________

do_grid:

bool, (default: True) plot cartopy grid lines

grid_opt:

dict, (default: dict()) additional options that are given to the cartopy gridline plotting via kwarg

do_boundbox:

bool, (default: True) plot cartopy black bounding box. If you make plots as a texture to use in blender, unity or pyvista the bounding box has to be removed with do_boundbox=False

___colorbar_________________________________________

cb_label:

str, (default: None) if a string it is used as colorbar label, otherwise information from data ('long_name', 'short_name') is used

cb_lunit:

str, (default: None) if a string it is used as colorbar unit label, otherwise info from data is used

cb_ltime:

str, (default: None) if a string it is used as colorbar time label, otherwise info from data is used

cb_ldep:

str, (default: None) if a string it is used as colorbar depth label, otherwise info from data is used

cb_opt:

dict, (default: dict()) direct option for colorbar via kwarg

cbl_opt:

dict, (default: dict()) direct option for colorbar labels (fontsize, fontweight, …) via kwarg

cbtl_opt:

dict, (default: dict()) direct option for colorbar tick labels (fontsize, fontweight, …) via kwarg

___axes______________________________________

ax_title:

str, (default: ‘descript’) if ‘descript’, the descript attribute of the data is used as axes title, otherwise the given string is used as axes title

ax_opt:

dict, (default: dict()) set options for the axes arrangement, see subroutine do_axes_arrange

axl_opt:

dict, (default: dict()) set option for axes labels (fontsize, …)

___enumerate_________________________________

do_enum:

bool, (default: False) do enumeration of axes with a), b), c) …

enum_opt:

dict, (default: dict()) direct option for enumeration strings via kwarg

enum_str:

list, (default: []) overwrite default enumeration strings ,

enum_x:

float, (default: 0.005) x position of enumeration string in axes coordinates

enum_y:

float, (default: 1.000) y position of enumeration string in axes coordinates

enum_dir:

str, (default: ‘lr’) direction of numbering, ‘lr’ from left to right, ‘ud’ from up to down

___save figure________________________________

do_save:

str, (default: None) if None the figure will not be saved, if string the figure will be saved; the string must give directory and filename where to save

save_dpi:

int, (default: 300) dpi resolution at which the figure is saved

save_opt:

dict, (default: dict()) direct option for saving via kwarg

___set output_________________________________

nargout:

list, (default: [‘hfig’, ‘hax’, ‘hcb’]) list of variables that are given out from the routine. Default:

  • hfig … figure handle

  • hax … list of axes handle

  • hcb … list of colorbar handles (every variable that is defined in this subroutine can become output parameter)


Returns:

hfig:

returns figure handle

hax:

returns list with axes handle

hcb:

returns colorbar handle


tripyview.sub_plot.plot_hquiver(mesh, data, box=None, cinfo=None, nrow=1, ncol=1, proj='pc', do_ie2n=False, do_rescale=False, do_quiv=True, quiv_opt={}, quiv_scalfac=1, quiv_arrwidth=0.25, quiv_dens=0.5, quiv_smax=10, quiv_shiftL=2, quiv_smooth=2, do_mesh=False, mesh_opt={}, do_bot=True, bot_opt={}, do_topo='tpc', topo_opt={}, topo_cont=True, topoc_opt={}, topo_contl=False, topocl_opt={}, do_lsm='fesom', lsm_opt={}, lsm_res='low', do_grid=True, do_boundbox=True, grid_opt={}, cb_label=None, cb_lunit=None, cb_ltime=None, cb_ldep=None, cb_opt={}, cbl_opt={}, cbtl_opt={}, ax_title='descript', ax_opt={}, do_enum=False, enum_opt={}, enum_str=[], enum_x=[0.005], enum_y=[1.0], enum_dir='lr', do_save=None, save_dpi=300, save_opt={}, nargout=['hfig', 'hax', 'hcb'])[source]

–> plot FESOM2 horizontal data slice as quiver plot:


Parameters:

mesh:

fesom2 mesh object, with all mesh information

data:

xarray dataset object, or list of xarray dataset object

box:

None, list (default: None) regional limitation of plot. For ortho… box=[lonc, latc], nears…box=[lonc, latc, zoom], for all others box = [lonmin, lonmax, latmin, latmax]

cinfo:

None, dict() (default: None), dictionary with colorbar information. Information that is given is used, missing entries are computed. cinfo dictionary entries can be:

  • cinfo[‘cmin’], cinfo[‘cmax’], cinfo[‘cref’] … scalar min, max, reference value

  • cinfo[‘crange’] … list with [cmin, cmax, cref], overrides scalar values

  • cinfo[‘cnum’] … minimum number of colors

  • cinfo[‘cstr’] … name of colormap, see sub_colormap_c2c.py

  • cinfo[‘cmap’] … colormap object (‘wbgyr’, ‘blue2red’, ‘jet’ …)

  • cinfo[‘clevel’] … color level array

nrow:

int, (default: 1) number of rows when plotting multiple data panels

ncol:

int, (default: 1) number of columns when plotting multiple data panels

proj:

str, (default: ‘pc’) which projection should be used:

  • pc … PlateCarree (box=[lonmin, lonmax, latmin, latmax])

  • merc … Mercator (box=[lonmin, lonmax, latmin, latmax])

  • rob … Robinson (box=[lonmin, lonmax, latmin, latmax])

  • eqearth… EqualEarth (box=[lonmin, lonmax, latmin, latmax])

  • mol … Mollweide (box=[lonmin, lonmax, latmin, latmax])

  • nps … NorthPolarStereo (box=[-180, 180, >0, latmax])

  • sps … SouthPolarStereo (box=[-180, 180, latmin, <0])

  • ortho … Orthographic (box=[loncenter, latcenter])

  • nears … NearsidePerspective (box=[loncenter, latcenter, zoom])

  • channel… PlateCarree

do_ie2n:

bool, (default: False) do interpolation of data on elements towards nodes

do_rescale:

bool, str, np.array (default: False) do scaling of the colorbar:

  • False … scale data automatically scientifically by 10^x, for data larger 10^3 or smaller 10^-3

  • log10 … do logarithmic scaling

  • slog10 … do symmetric logarithmic scaling

  • np.array() … scale colorbar stepwise according to values in np.array, also allows for non-linear colortick steps

___plot quiver parameters___________________________

do_quiv:

bool, (default: True), do cartopy quiver plot

quiv_opt:

dict, (default: dict()) additional options that are given to quiver plot routine

quiv_scalfac:

float, (default: 1.0) bigger means larger arrows

quiv_arrwidth:

float, (default: 0.25) scale arrow width

quiv_dens:

float, (default: 0.5) larger values exclude more arrows

quiv_smax:

float, (default: 10) small arrows are scaled up strongly with factor smax, it is off when smax=1

quiv_shiftL:

float, (default: 2) shift the smoothing function to the left

quiv_smooth:

float, (default: 2) slope of the transition zone, a smaller value gives a steeper transition

___plot mesh________________________________________

do_mesh:

bool, (default: False), overlay the FESOM grid over the data plot

mesh_opt:

dict, (default: dict()) additional options that are given to the mesh plotting via kwarg

___plot bottom mask_________________________________

do_bot:

bool, (default: True), overlay topographic bottom mask

bot_opt:

dict, (default: dict()) additional options that are given to the bottom mask plotting via kwarg

___plot topo________________________________________

do_topo:

str, (default: tpc) how the topography is plotted:

  • tpc … make pseudocolor plot (tripcolor)

  • tcf … make contourf color plot (tricontourf)

topo_opt:

dict, (default: dict()) additional options that are given to tripcolor or tricontourf via the kwarg argument

topo_cont:

bool, (default: True) overlay contour line plot of the data

topoc_opt:

dict, (default: dict()) additional options that are given to tricontour via the kwarg argument

topo_contl:

bool, (default: False) label the overlaid contour line plot

topocl_opt:

dict, (default: dict()) additional options that are given to clabel via the kwarg argument

___plot lsmask______________________________________

do_lsm:

str, (default: ‘fesom’), overlay the land-sea mask (inverted FESOM ocean grid); options are:

  • fesom … grey fesom landsea mask

  • stock … uses cartopy stock image

  • bluemarble … uses bluemarble image in folder tripyview/background/

  • etopo … uses etopo image in folder tripyview/background/

lsm_opt:

dict, (default: dict()) additional options that are given to the landsea mask plotting via kwarg

lsm_res:

str, (default=’low’) resolution of bluemarble texture file either ‘low’ or ‘high’

___plot cartopy gridlines____________________________

do_grid:

bool, (default: True) plot cartopy grid lines

grid_opt:

dict, (default: dict()) additional options that are given to the cartopy gridline plotting via kwarg

do_boundbox:

bool, (default: True) plot cartopy black bounding box. If you make plots as a texture to use in blender, unity or pyvista the bounding box has to be removed with do_boundbox=False

___colorbar_________________________________________

cb_label:

str, (default: None) if string it is used as colorbar label, otherwise the information from the data attributes (‘long_name’, ‘short_name’) is used

cb_lunit:

str, (default: None) if string it is used as colorbar unit label, otherwise info from the data is used

cb_ltime:

str, (default: None) if string it is used as colorbar time label, otherwise info from the data is used

cb_ldep:

str, (default: None) if string it is used as colorbar depth label, otherwise info from the data is used

cb_opt:

dict, (default: dict()) direct option for colorbar via kwarg

cbl_opt:

dict, (default: dict()) direct option for colorbar labels (fontsize, fontweight, …) via kwarg

cbtl_opt:

dict, (default: dict()) direct option for colorbar tick labels (fontsize, fontweight, …) via kwarg

___axes______________________________________

ax_title:

str, (default: ‘descript’) if ‘descript’, the descript attribute of the data is used as axes title, otherwise the given string is used as axes title

ax_opt:

dict, (default: dict()) set options for the axes arrangement, see subroutine do_axes_arrange

axl_opt:

dict, (default: dict()) set option for axes labels (fontsize, …)

___enumerate_________________________________

do_enum:

bool, (default: False) do enumeration of axes with a), b), c) …

enum_opt:

dict, (default: dict()) direct option for enumeration strings via kwarg

enum_str:

list, (default: []) overwrite default enumeration strings ,

enum_x:

float, (default: 0.005) x position of enumeration string in axes coordinates

enum_y:

float, (default: 1.000) y position of enumeration string in axes coordinates

enum_dir:

str, (default: ‘lr’) direction of numbering, ‘lr’ from left to right, ‘ud’ from up to down

___save figure________________________________

do_save:

str, (default: None) if None the figure will not be saved, if string the figure will be saved; the string must give directory and filename where to save

save_dpi:

int, (default: 300) dpi resolution at which the figure is saved

save_opt:

dict, (default: dict()) direct option for saving via kwarg

___set output_________________________________

nargout:

list, (default: [‘hfig’, ‘hax’, ‘hcb’]) list of variables that are given out from the routine. Default:

  • hfig … figure handle

  • hax … list of axes handle

  • hcb … list of colorbar handles (every variable that is defined in this subroutine can become output parameter)


Returns:

hfig: returns figure handle

hax: returns list with axes handle

hcb: returns colorbar handle

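A minimal usage sketch (the mesh object, data path, years and keyword values below are only placeholders that illustrate the parameters documented above):

from tripyview.sub_plot import plot_hquiver
from tripyview.sub_data import load_data_fesom2

# 'mesh' is assumed to be an already created fesom2 tripyview mesh object
# load horizontal velocities at 100 m depth as vector data ('vec+' syntax)
data_uv = load_data_fesom2(mesh, '/path/to/fesom/output/', vname='vec+u+v',
                           year=2010, depth=100, descript='exp1')

# quiver plot on a PlateCarree map of the North Atlantic
hfig, hax, hcb = plot_hquiver(mesh, data_uv, proj='pc',
                              box=[-80, 0, 20, 70],
                              quiv_dens=0.6, quiv_scalfac=2.0)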

tripyview.sub_plot.plot_hslice(mesh, data, box=None, cinfo=None, nrow=1, ncol=1, proj='pc', do_ie2n=False, do_rescale=False, do_plt='tpc', plt_opt={}, plt_contb=False, pltcb_opt={}, plt_contf=False, pltcf_opt={}, plt_contr=False, pltcr_opt={}, plt_contl=False, pltcl_opt={}, do_mesh=False, mesh_opt={}, do_streaml=False, streaml_dat=None, streaml_opt={}, do_streaml_leg=True, streaml_leg_opt={}, do_bot=True, bot_opt={}, do_lsm='fesom', lsm_opt={}, lsm_res='low', do_grid=True, do_boundbox=True, grid_opt={}, cb_label=None, cb_lunit=None, cb_ltime=None, cb_ldep=None, cb_opt={}, cbl_opt={}, cbtl_opt={}, ax_title='descript', ax_opt={}, axl_opt={}, do_enum=False, enum_opt={}, enum_str=[], enum_x=[0.005], enum_y=[1.0], enum_dir='lr', do_save=None, save_dpi=300, save_opt={}, nargout=['hfig', 'hax', 'hcb'])[source]

–> plot FESOM2 horizontal data slice:


Parameters:

mesh:

fesom2 mesh object, with all mesh information

data:

xarray dataset object, or list of xarray dataset object

box:

None, list (default: None) regional limitation of plot. For ortho… box=[lonc, latc], nears…box=[lonc, latc, zoom], for all others box = [lonmin, lonmax, latmin, latmax]

cinfo:

None, dict() (default: None), dictionary with colorbar information. Information that is given is used, missing entries are computed. cinfo dictionary entries can be:

  • cinfo[‘cmin’], cinfo[‘cmax’], cinfo[‘cref’] … scalar min, max, reference value

  • cinfo[‘crange’] … list with [cmin, cmax, cref], overrides scalar values

  • cinfo[‘cnum’] … minimum number of colors

  • cinfo[‘cstr’] … name of colormap, see sub_colormap_c2c.py

  • cinfo[‘cmap’] … colormap object (‘wbgyr’, ‘blue2red’, ‘jet’ …)

  • cinfo[‘clevel’] … color level array

nrow:

int, (default: 1) number of rows when plotting multiple data panels

ncol:

int, (default: 1) number of columns when plotting multiple data panels

proj:

str, (default: ‘pc’) which projection should be used:

  • pc … PlateCarree (box=[lonmin, lonmax, latmin, latmax])

  • merc … Mercator (box=[lonmin, lonmax, latmin, latmax])

  • rob … Robinson (box=[lonmin, lonmax, latmin, latmax])

  • eqearth… EqualEarth (box=[lonmin, lonmax, latmin, latmax])

  • mol … Mollweide (box=[lonmin, lonmax, latmin, latmax])

  • nps … NorthPolarStereo (box=[-180, 180, >0, latmax])

  • sps … SouthPolarStereo (box=[-180, 180, latmin, <0])

  • ortho … Orthographic (box=[loncenter, latcenter])

  • nears … NearsidePerspective (box=[loncenter, latcenter, zoom])

  • channel… PlateCarree

do_ie2n:

bool, (default: False) do interpolation of data on elements towards nodes

do_rescale:

bool, str, np.array (default: False) do scaling of the colorbar:

  • False … scale data automatically scientifically by 10^x, for data larger 10^3 or smaller 10^-3

  • log10 … do logarithmic scaling

  • slog10 … do symmetric logarithmic scaling

  • np.array() … scale colorbar stepwise according to values in np.array, also allows for non-linear colortick steps

___plot data parameters_____________________________

do_plt:

str, (default: tpc) which plot type should be used:

  • tpc … make pseudocolor plot (tripcolor)

  • tcf … make contourf color plot (tricontourf)

plt_opt:

dict, (default: dict()) additional options that are given to tripcolor or tricontourf via the kwarg argument

plt_contb:

bool, (default: False) overlay thin contour lines of all colorbar steps (background)

pltcb_opt:

dict, (default: dict()) background contour line option

plt_contf:

bool, (default: False) overlay thicker contour lines of the main colorbar steps (foreground)

pltcf_opt:

dict, (default: dict()) foreground contour line option

plt_contr:

bool, (default: False) overlay thick contour lines of reference color steps (reference)

pltcr_opt:

dict, (default: dict()) reference contour line option

plt_contl:

bool, (default: False) label the overlaid contour line plot

pltcl_opt:

dict, (default: dict()) additional options that are given to clabel via the kwarg argument

___plot mesh________________________________________

do_mesh:

bool, (default: False), overlay the FESOM grid over the data plot

mesh_opt:

dict, (default: dict()) additional options that are given to the mesh plotting via kwarg

___plot streamlines_________________________________

do_streaml:

bool, (default: False), overlay streamline plot based on regular gridded u,v data

streaml_dat:

list, (default: None), list with u,v regular gridded velocity data

streaml_opt:

dict, (default: dict()), additional options that are given to the streamline plot routine

do_streaml_leg:

bool, (default: True) overlay a legend for the streamline plot, keep in mind that the legend position is somewhat preset by hand

streaml_leg_opt:

dict, (default=dict()) additional option for the position of the streamline legend

streaml_leg_opt = dict({'x':75, # lon position of legend in deg
                        'y':60, # lat position of legend in deg 
                        'dy':5, # vertical distance of legend labels in deg
                        'dw':10, # width of lines in deg
                        'arr_s': [0.05, 0.1, 0.2, 0.3, 0.4]  })

___plot bottom mask_________________________________

do_bot:

bool, (default: True), overlay topographic bottom mask

bot_opt:

dict, (default: dict()) additional options that are given to the bottom mask plotting via kwarg

___plot lsmask______________________________________

do_lsm:

str, (default: ‘fesom’), overlay the land-sea mask (inverted FESOM ocean grid); options are:

  • fesom … grey fesom landsea mask

  • stock … uses cartopy stock image

  • bluemarble … uses bluemarble image in folder tripyview/background/

  • etopo … uses etopo image in folder tripyview/background/

lsm_opt:

dict, (default: dict()) additional options that are given to the landsea mask plotting via kwarg

lsm_res:

str, (default=’low’) resolution of bluemarble texture file either ‘low’ or ‘high’

___plot cartopy gridlines____________________________

do_grid:

bool, (default: True) plot cartopy grid lines

grid_opt:

dict, (default: dict()) additional options that are given to the cartopy gridline plotting via kwarg

do_boundbox:

bool, (default: True) plot cartopy black bounding box. If you make plots as a texture to use in blender, unity or pyvista the bounding box has to be removed with do_boundbox=False

___colorbar_________________________________________

cb_label:

str, (default: None) if string it is used as colorbar label, otherwise the information from the data attributes (‘long_name’, ‘short_name’) is used

cb_lunit:

str, (default: None) if string it is used as colorbar unit label, otherwise info from the data is used

cb_ltime:

str, (default: None) if string it is used as colorbar time label, otherwise info from the data is used

cb_ldep:

str, (default: None) if string it is used as colorbar depth label, otherwise info from the data is used

cb_opt:

dict, (default: dict()) direct option for colorbar via kwarg

cbl_opt:

dict, (default: dict()) direct option for colorbar labels (fontsize, fontweight, …) via kwarg

cbtl_opt:

dict, (default: dict()) direct option for colorbar tick labels (fontsize, fontweight, …) via kwarg

___axes______________________________________

ax_title:

str, (default: ‘descript’) if ‘descript’, the descript attribute of the data is used as axes title, otherwise the given string is used as axes title

ax_opt:

dict, (default: dict()) set options for the axes arrangement, see subroutine do_axes_arrange

axl_opt:

dict, (default: dict()) set option for axes labels (fontsize, …)

___enumerate_________________________________

do_enum:

bool, (default: False) do enumeration of axes with a), b), c) …

enum_opt:

dict, (default: dict()) direct option for enumeration strings via kwarg

enum_str:

list, (default: []) overwrite default enumeration strings ,

enum_x:

float, (default: 0.005) x position of enumeration string in axes coordinates

enum_y:

float, (default: 1.000) y position of enumeration string in axes coordinates

enum_dir:

str, (default: ‘lr’) direction of numbering, ‘lr’ from left to right, ‘ud’ from up to down

___save figure________________________________

do_save:

str, (default: None) if None the figure will not be saved, if string the figure will be saved; the string must give directory and filename where to save

save_dpi:

int, (default: 300) dpi resolution at which the figure is saved

save_opt:

dict, (default: dict()) direct option for saving via kwarg

___set output_________________________________

nargout:

list, (default: [‘hfig’, ‘hax’, ‘hcb’]) list of variables that are given out from the routine. Default:

  • hfig … figure handle

  • hax … list of axes handle

  • hcb … list of colorbar handles (every variable that is defined in this subroutine can become output parameter)


Returns:

hfig:

returns figure handle

hax:

returns list with axes handle

hcb:

returns colorbar handle

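A minimal usage sketch (the mesh object, data path and keyword values are placeholders that only repeat parameters documented above):

from tripyview.sub_plot import plot_hslice
from tripyview.sub_data import load_data_fesom2

# 'mesh' is assumed to be an already created fesom2 tripyview mesh object
# time mean surface temperature over 2000-2009
data_t = load_data_fesom2(mesh, '/path/to/fesom/output/', vname='temp',
                          year=range(2000, 2010), depth=0, do_tarithm='mean')

# pseudocolor plot on a Robinson projection with reference contour lines
hfig, hax, hcb = plot_hslice(mesh, data_t, proj='rob', do_plt='tpc',
                             plt_contr=True, cb_label='temperature',
                             cb_lunit='degC')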

tripyview.sub_plot.plot_tline(data, box, box_idx=None, box_label=None, boxl_opt={}, nrow=1, ncol=1, proj='index+time', n_cycl=None, do_allcycl=False, do_concat=False, do_shdw=True, do_mean=True, do_std=False, plt_opt={}, mark_opt={}, do_grid=True, grid_opt={}, ax_title='descript', ax_xlabel=None, ax_ylabel=None, ax_xunit=None, ax_yunit=None, ax_opt={}, axl_opt={}, ax_xlim=None, ax_ylim=None, do_enum=False, enum_opt={}, enum_str=[], enum_x=[0.005], enum_y=[1.0], enum_dir='lr', do_save=None, save_dpi=300, save_opt={}, nargout=['hfig', 'hax'])[source]

–> do plotting of mean indices over time (e.g. time-series)


Parameters:

data:

xarray dataset object, or list of xarray dataset object

box:

None, list (default: None) list with regional box limitation for index computation box can be:

  • [‘global’] … compute global index

  • [shp.Reader] … index region defined by shapefile

  • [ [lonmin,lonmax,latmin, latmax], boxname] index region defined by rect box

  • [ [ [px1,px2…], [py1,py2,…]], boxname] index region defined by polygon

  • [ np.array(2 x npts) boxname] index region defined by polygon

box_idx:

int, (default: None) index in boxlist

box_label:

str (default: None) overwrites boxname string

boxl_opt:

dict, (default: dict()) additional options for the boxlabel string (fontsize, …)

nrow:

int, (default: 1) number of rows when plotting multiple data panels

ncol:

int, (default: 1) number of columns when plotting multiple data panels

proj:

str, (default: ‘index+time’)

n_cycl:

int, (default: None) How many spinup cycles are contained in the data_list. Info is needed when do_allcycl=True,

do_allcycl:

bool, (default: False) plot all spinup cycles, each colored by a colormap value

do_concat:

bool, (default: False) attach the time-series of the individual spinup cycles behind each other, creating one long spinup time-series

do_shdw:

bool, (default: True) give lines a black outline

do_mean:

bool, (default: True) plot triangle on the left side that indicates the mean value

do_rescale:

bool, str, np.array (default: False) do scaling of the colorbar:

  • False … scale data automatically scientifically by 10^x, for data larger 10^3 or smaller 10^-3

  • log10 … do logarithmic scaling

  • slog10 … do symmetric logarithmic scaling

___plot data parameters_____________________________

plt_opt:

dict, (default: dict()) additional options that are given to line plot routine via the kwarg argument

mark_opt:

dict, (default: dict()) additional options that are given to control the line markers via the kwarg argument

___plot gridlines___________________________________

do_grid:

bool, (default: True) plot cartopy grid lines

grid_opt:

dict, (default: dict()) additional options that are given to the cartopy gridline plotting via kwarg

___axes______________________________________

ax_title:

str, (default: ‘descript’) if ‘descript’, the descript attribute of the data is used as axes title, otherwise the given string is used as axes title

ax_xlabel:

str, (default: None) overwrites default xlabel

ax_ylabel:

str, (default: None) overwrites default ylabel

ax_xunit:

str, (default: None) overwrites default xlabel unit

ax_yunit:

str, (default: None) overwrites default ylabel unit

ax_opt:

dict, (default: dict()) set options for the axes arrangement, see subroutine do_axes_arrange

ax_xlim:

list, (default: None) override the data-derived x-limits

ax_ylim:

list, (default: None) override the data-derived y-limits

___enumerate_________________________________

do_enum:

bool, (default: False) do enumeration of axes with a), b), c) …

enum_opt:

dict, (default: dict()) direct option for enumeration strings via kwarg

enum_str:

list, (default: []) overwrite default enumeration strings

enum_x:

float, (default: 0.005) x position of enumeration string in axes coordinates

enum_y:

float, (default: 1.000) y position of enumeration string in axes coordinates

enum_dir:

str, (default: ‘lr’) direction of numbering, ‘lr’ from left to right, ‘ud’ from up to down

___save figure________________________________

do_save:

str, (default: None) if None the figure will not be saved, if string the figure will be saved; the string must give directory and filename where to save

save_dpi:

int, (default: 300) dpi resolution at which the figure is saved

save_opt:

dict, (default: dict()) direct option for saving via kwarg

___set output_________________________________

nargout:

list, (default: [‘hfig’, ‘hax’]) list of variables that are given out from the routine. Default:

  • hfig … figure handle

  • hax … list of axes handle

  • (every variable that is defined in this subroutine can become output parameter)


Returns:

hfig:

returns figure handle

hax:

returns list with axes handle

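A minimal usage sketch (the index time-series is assumed to come from tripyview.sub_index.load_index_fesom2 documented further below; mesh object, paths and years are placeholders):

from tripyview.sub_plot import plot_tline
from tripyview.sub_index import load_index_fesom2
from tripyview.sub_data import load_data_fesom2

# 'mesh' is assumed to be an already created fesom2 tripyview mesh object
box = [[-80, 0, 40, 65], 'North Atlantic']   # rect box as documented above

# surface temperature, keep the full time axis (do_tarithm=None)
sst = load_data_fesom2(mesh, '/path/to/fesom/output/', vname='temp',
                       year=range(1958, 2020), depth=0, do_tarithm=None)
index = load_index_fesom2(mesh, sst, box_list=[box], do_harithm='wmean')

# time series of the area weighted mean over the box
hfig, hax = plot_tline(index, box=[box], ax_ylabel='temperature', ax_yunit='degC')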

tripyview.sub_plot.plot_vline(data, box=None, box_idx=None, box_label=None, boxl_opt={}, cinfo=None, nrow=1, ncol=1, proj='index+depth', n_cycl=None, do_allcycl=False, do_shdw=True, do_mean=False, do_rescale=False, plt_opt={}, mark_opt={}, do_grid=True, grid_opt={'yexp': True}, ax_title='descript', ax_xlabel=None, ax_ylabel=None, ax_xunit=None, ax_yunit=None, ax_opt={}, ax_xlim=None, ax_ylim=None, do_enum=False, enum_opt={'horizontalalignment': 'center'}, enum_str=[], enum_x=[0.0], enum_y=[1.005], enum_dir='lr', do_save=None, save_dpi=300, save_opt={}, nargout=['hfig', 'hax'])[source]

–> do plotting of mean indices over depth (e.g. vertical profiles)


Parameters:

data:

xarray dataset object, or list of xarray dataset object

box:

None, list (default: None) list with regional box limitation for index computation box can be:

  • [‘global’] … compute global index

  • [shp.Reader] … index region defined by shapefile

  • [ [lonmin,lonmax,latmin, latmax], boxname] index region defined by rect box

  • [ [ [px1,px2…], [py1,py2,…]], boxname] index region defined by polygon

  • [ np.array(2 x npts) boxname] index region defined by polygon

box_idx:

int, (default: None) index in boxlist

box_label:

str, (default: None) overwrites boxname string

boxl_opt:

dict, (default: dict()) additional options for the boxlabel string (fontsize, …)

nrow:

int, (default: 1) number of rows when plotting multiple data panels

ncol:

int, (default: 1) number of columns when plotting multiple data panels

proj:

str, (default: ‘index+depth’)

n_cycl:

int, (default: None) How many spinup cycles are contained in the data_list. Info is needed when do_allcycl=True,

do_allcycl:

bool, (default: False) plot all spinup cycles, each colored by a colormap value

do_shdw:

bool, (default: True) give lines a black outline

do_mean:

bool, (default: False) plot triangle on the left side that indicates the mean value

do_rescale:
bool, str, np.array (default: False) do scaling of the colorbar:

  • False … scale data automatically scientifically by 10^x, for data larger 10^3 or smaller 10^-3

  • log10 … do logarithmic scaling

  • slog10 … do symmetric logarithmic scaling

___plot data parameters_____________________________

plt_opt:

dict, (default: dict()) additional options that are given to line plot routine via the kwarg argument

mark_opt:

dict, (default: dict()) additional options that are given to control the line markers via the kwarg argument

___plot gridlines___________________________________

do_grid:

bool, (default: True) plot cartopy grid lines

grid_opt:

dict, (default: dict()) additional options that are given to the cartopy gridline plotting via kwarg

___axes______________________________________

ax_title:

str, (default: ‘descript’) if ‘descript’, the descript attribute of the data is used as axes title, otherwise the given string is used as axes title

ax_xlabel:

str, (default: None) overwrites default xlabel

ax_ylabel:

str, (default: None) overwrites default ylabel

ax_xunit:

str, (default: None) overwrites default xlabel unit

ax_yunit:

str, (default: None) overwrites default ylabel unit

ax_opt:

dict, (default: dict()) set options for the axes arrangement, see subroutine do_axes_arrange

ax_xlim:

list, (default: None) override the data-derived x-limits

ax_ylim:

list, (default: None) override the data-derived y-limits

___enumerate_________________________________

do_enum:

bool, (default: False) do enumeration of axes with a), b), c) …

enum_opt:

dict, (default: dict()) direct option for enumeration strings via kwarg

enum_str:

list, (default: []) overwrite default enumeration strings

enum_x:

float, (default: 0.005) x position of enumeration string in axes coordinates

enum_y:

float, (default: 1.000) y position of enumeration string in axes coordinates

enum_dir:

str, (default: ‘lr’) direction of numbering, ‘lr’ from left to right, ‘ud’ from up to down

___save figure________________________________

do_save:

str, (default: None) if None the figure will not be saved, if string the figure will be saved; the string must give directory and filename where to save

save_dpi:

int, (default: 300) dpi resolution at which the figure is saved

save_opt:

dict, (default: dict()) direct option for saving via kwarg

___set output_________________________________

nargout:

list, (default: [‘hfig’, ‘hax’]) list of variables that are given out from the routine. Default:

  • hfig … figure handle

  • hax … list of axes handle

  • (every variable that is defined in this subroutine can become output parameter)


Returns:

hfig:

returns figure handle

hax:

returns list with axes handle

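A minimal usage sketch (as above the mesh object and paths are placeholders, and the return value of load_index_fesom2 is assumed to be the index dataset list expected by plot_vline):

from tripyview.sub_plot import plot_vline
from tripyview.sub_index import load_index_fesom2
from tripyview.sub_data import load_data_fesom2

# 'mesh' is assumed to be an already created fesom2 tripyview mesh object
box = [[-30, 15, 60, 80], 'Nordic Seas']     # rect box as documented above

# time mean temperature on all depth levels (depth=None keeps the vertical axis)
temp = load_data_fesom2(mesh, '/path/to/fesom/output/', vname='temp',
                        year=range(2000, 2010), depth=None, do_tarithm='mean')
profile = load_index_fesom2(mesh, temp, box_list=[box],
                            do_harithm='wmean', do_zarithm=None)

# horizontally averaged temperature profile over the box, plotted against depth
hfig, hax = plot_vline(profile, box=[box], ax_xlabel='temperature', ax_xunit='degC')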

tripyview.sub_plot.plot_vslice(mesh, data, box=None, box_idx=None, box_label=None, boxl_opt={}, cinfo=None, nrow=1, ncol=1, proj=None, do_ie2n=False, do_rescale=False, do_plt='tpc', plt_opt={}, plt_contb=False, pltcb_opt={}, plt_contf=False, pltcf_opt={}, plt_contr=False, pltcr_opt={}, plt_contl=False, pltcl_opt={}, do_mesh=False, mesh_opt={}, do_bot=True, bot_opt={}, do_lsm='fesom', lsm_opt={}, lsm_res='low', do_grid=True, grid_opt={'yexp': True}, cb_label=None, cb_lunit=None, cb_ltime=None, cb_ldep=None, cb_opt={}, cbl_opt={}, cbtl_opt={}, ax_title='descript', ax_opt={}, ax_xlim=None, ax_ylim=None, do_enum=False, enum_opt={'horizontalalignment': 'center'}, enum_str=[], enum_x=[0.0], enum_y=[1.005], enum_dir='lr', do_save=None, save_dpi=300, save_opt={}, nargout=['hfig', 'hax', 'hcb'])[source]

–> plot FESOM2 vertical data slice:


Parameters:

mesh:

fesom2 mesh object, with all mesh information

data:

xarray dataset object, or list of xarray dataset object

box:

None, list (default: None) list with regional box limitation for index computation box can be:

  • [‘global’] … compute global index

  • [shp.Reader] … index region defined by shapefile

  • [ [lonmin,lonmax,latmin, latmax], boxname] index region defined by rect box

  • [ [ [px1,px2…], [py1,py2,…]], boxname] index region defined by polygon

  • [ np.array(2 x npts), boxname] index region defined by polygon

box_idx:

int, (default: None) index in boxlist

box_label:

str (default: None) overwrites boxname string

boxl_opt:

dict, (default: dict()) additional options for the boxlabel string (fontsize, …)

cinfo:

None, dict() (default: None), dictionary with colorbar information. Information that is given is used, missing entries are computed. cinfo dictionary entries can be:

  • cinfo[‘cmin’], cinfo[‘cmax’], cinfo[‘cref’] … scalar min, max, reference value

  • cinfo[‘crange’] … list with [cmin, cmax, cref], overrides scalar values

  • cinfo[‘cnum’] … minimum number of colors

  • cinfo[‘cstr’] … name of colormap, see sub_colormap_c2c.py

  • cinfo[‘cmap’] … colormap object (‘wbgyr’, ‘blue2red’, ‘jet’ …)

  • cinfo[‘clevel’] … color level array

nrow:

int, (default: 1) number of rows when plotting multiple data panels

ncol:

int, (default: 1) number of columns when plotting multiple data panels

proj:

str, (default: None) is chosen automatically from the data attribute proj, can also be set by hand to:

  • index+depth+xy

  • index+depth+time

  • zmoc

  • dmoc

  • dmoc+depth

  • dmoc+dens

do_ie2n:

bool, (default: False) do interpolation of data on elements towards nodes

do_rescale:

bool, str, np.array (default: False) do scaling of the colorbar:

  • False … scale data automatically scientifically by 10^x, for data larger 10^3 or smaller 10^-3

  • log10 … do logarithmic scaling

  • slog10 … do symmetric logarithmic scaling

  • np.array() … scale colorbar stepwise according to values in np.array, also allows for non-linear colortick steps

___plot data parameters_____________________________

do_plt:

str, (default: tpc) which plot type should be used:

  • tpc … make pseudocolor plot (tripcolor)

  • tcf … make contourf color plot (tricontourf)

plt_opt:

dict, (default: dict()) additional options that are given to tripcolor or tricontourf via the kwarg argument

plt_contb:

bool, (default: False) overlay thin contour lines of all colorbar steps (background)

pltcb_opt:

dict, (default: dict()) background contour line option

plt_contf:

bool, (default: False) overlay thicker contour lines of the main colorbar steps (foreground)

pltcf_opt:

dict, (default: dict()) foreground contour line option

plt_contr:

bool, (default: False) overlay thick contour lines of reference color steps (reference)

pltcr_opt:

dict, (default: dict()) reference contour line option

plt_contl:

bool, (default: False) label the overlaid contour line plot

pltcl_opt:

dict, (default: dict()) additional options that are given to clabel via the kwarg argument

___plot mesh________________________________________

do_mesh:

bool, (default: False), overlay the FESOM grid over the data plot

mesh_opt:

dict, (default: dict()) additional options that are given to the mesh plotting via kwarg

___plot bottom mask_________________________________

do_bot:

bool, (default: True), overlay topographic bottom mask

bot_opt:

dict, (default: dict()) additional options that are given to the bottom mask plotting via kwarg

___plot cartopy gridlines____________________________

do_grid:

bool, (default: True) plot cartopy grid lines

grid_opt:

dict, (default: dict()) additional options that are given to the cartopy gridline plotting via kwarg

___colorbar_________________________________________

cb_label:

str, (default: None) if string it is used as colorbar label, otherwise the information from the data attributes (‘long_name’, ‘short_name’) is used

cb_lunit:

str, (default: None) if string it is used as colorbar unit label, otherwise info from the data is used

cb_ltime:

str, (default: None) if string it is used as colorbar time label, otherwise info from the data is used

cb_ldep:

str, (default: None) if string it is used as colorbar depth label, otherwise info from the data is used

cb_opt:

dict, (default: dict()) direct option for colorbar via kwarg

cbl_opt:

dict, (default: dict()) direct option for colorbar labels (fontsize, fontweight, …) via kwarg

cbtl_opt:

dict, (default: dict()) direct option for colorbar tick labels (fontsize, fontweight, …) via kwarg

___axes______________________________________

ax_title:

str, (default: ‘descript’) if ‘descript’, the descript attribute of the data is used as axes title, otherwise the given string is used as axes title

ax_opt:

dict, (default: dict()) set options for the axes arrangement, see subroutine do_axes_arrange

axl_opt:

dict, (default: dict()) set option for axes labels (fontsize, …)

___enumerate_________________________________

do_enum:

bool, (default: False) do enumeration of axes with a), b), c) …

enum_opt:

dict, (default: dict()) direct option for enumeration strings via kwarg

enum_str:

list, (default: []) overwrite default enumeration strings ,

enum_x:

float, (default: 0.005) x position of enumeration string in axes coordinates

enum_y:

float, (default: 1.000) y position of enumeration string in axes coordinates

enum_dir:

str, (default: ‘lr’) direction of numbering, ‘lr’ from left to right, ‘ud’ from up to down

___save figure________________________________

do_save:

str, (default: None) if None the figure will not be saved, if string the figure will be saved; the string must give directory and filename where to save

save_dpi:

int, (default: 300) dpi resolution at which the figure is saved

save_opt:

dict, (default: dict()) direct option for saving via kwarg

___set output_________________________________

nargout:

list, (default: [‘hfig’, ‘hax’, ‘hcb’]) list of variables that are given out from the routine. Default:

  • hfig … figure handle

  • hax … list of axes handle

  • hcb … list of colorbar handles (every variable that is defined in this subroutine can become output parameter)


Returns:

hfig:

returns figure handle

hax:

returns list with axes handle

hcb:

returns colorbar handle

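A minimal usage sketch (it assumes a zonal mean section computed with tripyview.sub_transect.load_zmeantransect_fesom2 documented further below; the mesh object, paths and the box form are placeholders following the documentation):

from tripyview.sub_plot import plot_vslice
from tripyview.sub_transect import load_zmeantransect_fesom2
from tripyview.sub_data import load_data_fesom2

# 'mesh' is assumed to be an already created fesom2 tripyview mesh object
temp = load_data_fesom2(mesh, '/path/to/fesom/output/', vname='temp',
                        year=range(2000, 2010), depth=None, do_tarithm='mean')

# global zonal mean temperature section with 1 degree latitude bins
zmean = load_zmeantransect_fesom2(mesh, temp, box_list=['global'], dlat=1.0)

# proj is chosen automatically from the data attribute proj (see above)
hfig, hax, hcb = plot_vslice(mesh, zmean, cb_label='temperature', cb_lunit='degC')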

tripyview.sub_plot.set_cinfo(cstr, cnum, crange, cmin, cmax, cref, cfac, climit, chist, ctresh)[source]

–> initialise cinfo dictionary

Parameters:

cstr:

provide colormap string, can be either an own defined colormap, see sub_colormap.py (‘blue2red’, ‘wbgyr’, …), a matplotlib colormap (‘matplotlib.viridis’, ‘matplotlib.coolwarm’, …), or a cmocean colormap (‘cmocean.dens’, ‘cmocean.thermal’, …)

cnum:

minimum number of colors to use

crange:

list of min/max/ref colorrange [cmin, cmax, cref]

cmin:

set min value of colorrange

cmax:

set max value of colorrange

cref:

set reference value (center value of colorrange)

cfac:

factor to multiply cmin and cmax

climit:

provide list with cmin, cmax values [cmin, cmax], the reference value is determined automatically

chist:

True/False, whether the colorrange should be limited by a histogram to exclude extreme outliers

ctresh:

how many percent of the colorrange should be covered, default is 99%

Returns:

cinfo:

None, dict() (default: None), dictionary with colorbar info

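Instead of calling set_cinfo directly, the documented cinfo keys can also be assembled by hand and passed to the plot routines via their cinfo argument; a small sketch with purely illustrative values:

# hand-built colorbar dictionary using the keys documented for the cinfo parameter
cinfo = dict()
cinfo['cmin'], cinfo['cmax'], cinfo['cref'] = -2.0, 30.0, 15.0  # value range and reference
cinfo['cnum'] = 20                                              # minimum number of colors
cinfo['cstr'] = 'wbgyr'                                         # colormap name, see sub_colormap_c2c.py

# hfig, hax, hcb = plot_hslice(mesh, data, cinfo=cinfo)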

tripyview.sub_data.do_additional_attrs(data, vname, attr_dict)[source]

–> write additional information to variable attributes

Parameters:

data:

xarray dataset object

vname:

str, (default: None), variable name that should be loaded

attr_dict:

dict with different infos that are written to dataset variable attributes

Returns:

data:

xarray dataset object


tripyview.sub_data.do_anomaly(data1, data2)[source]

–> compute anomaly between two xarray Datasets

Parameters:

data1:

xarray dataset object

data2:

xarray dataset object

Returns:

anom:

xarray dataset object, data1-data2


tripyview.sub_data.do_comp_sel_levidx(zlev, depth, depidx, ndimax)[source]

–> compute level indices that are needed to interpolate the depth levels

Parameters:

zlev:

list, depth vector of the datas

depth:

list, with depth levels that should be interpolated

depidx:

bool, (default:False) if depth is int and depidx=True, depth index is selected that is closest to depth. No interpolation will be done

ndimax:

int, maximum number of levels mesh.n_iz.max()-1 for mid depth datas, mesh.n_iz.max() for full level data

Returns:

sel_levidx:

list, with level indices that should be extracted


tripyview.sub_data.do_depth_arithmetic(data, do_zarithm, dim_name)[source]

–> do arithmetic on depth dimension

Parameters:
data:

xarray dataset object

do_zarithm:

str (default=’mean’) do arithmetic on selected vertical layers options are

  • None, ‘None’

  • ‘mean’

  • ‘max’

  • ‘min’

  • ‘sum’

  • ‘wint’

  • ‘wmean’

dim_name:

str, name of depth dimension, is different for full-level and mid-level data

Returns:

data:

xarray dataset object


tripyview.sub_data.do_fnamemask(do_file, vname, runid, year)[source]

–> contains filename mask to distinguish between run, restart and blowup file that can be loaded

Parameters:

do_file:

str, which kind of file should be loaded, ‘run’,’restart_oce’, ‘restart_ice’, ‘blowup’

vname:

str, name of variable

runid:

str, runid of simulation usually ‘fesom’

year:

int, year number that should be loaded

Returns:

fname:

str, filename


tripyview.sub_data.do_gridinfo_and_weights(mesh, data, do_hweight=True, do_zweight=False)[source]

–> apply all coordinate information to the dataset: vertice/element centroid lon/lat positions, horizontal area weights for the scalar and vector cells, and vertical depth weights

Parameters:

mesh:

fesom2 tripyview mesh object, with all mesh information

data:

xarray dataset object

do_hweight:

bool, (default=True) store weights for weighted horizontal averages

do_zweight:

bool, (default=False) store weights for vertical weighted averages within the dataset

Returns:

data:

xarray dataset object, with coordinate + weight information

tripyview.sub_data.do_horiz_arithmetic(data, do_harithm, dim_name)[source]

–> do arithmetic on the horizontal dimension

Parameters:

data:

xarray dataset object

do_harithm:

str (default=’mean’) do arithmetic over the horizontal dimension; options are

  • None, ‘None’

  • ‘mean’ … arithmetic mean

  • ‘median’

  • ‘std’

  • ‘var’

  • ‘max’

  • ‘min’

  • ‘sum’ … arithmetic sum

  • ‘wint’ … weighted horizontal integral int( DATA*dxdy)

  • ‘wmean’ … weighted horizontal mean

dim_name:

str, name of depth dimension, is different for full-level and mid-level data

Returns:

data:

xarray dataset object

tripyview.sub_data.do_interp_e2n(data, mesh, do_ie2n)[source]

–> interpolate data on elements to vertices

Parameters:

data:

xarray dataset object

mesh:

fesom2 mesh object

do_ie2n:

bool, True/False if interpolation should be applied

Returns:

data:

xarray dataset object


tripyview.sub_data.do_pathlist(year, datapath, do_filename, do_file, vname, runid)[source]

–> create path/file list of data that should be loaded

Parameters:

year:

int, list, np.array, range of years that should be loaded

datapath:

str, path that leads to the FESOM2 data

do_filename:

bool, (default=None) load this specific filename string instead

do_file:

str, which kind of file should be loaded, ‘run’, ‘restart_oce’, ‘restart_ice’, ‘blowup’

vname:

str, name of variable

runid:

str, runid of simulation usually ‘fesom’

Returns:

pathlist:

str, list


tripyview.sub_data.do_potential_density(data, do_pdens, vname, vname2, vname_tmp)[source]

–> compute potential density based on temp and salt

Parameters:

data:

xarray dataset object, containing temperature and salinity data

do_pdens:

bool, should potential density be computed

vname:

str, name of temperature variable in dataset

vname2:

str, name of salinity variable in dataset

vname_tmp:

str, which potential density should be computed:

  • ‘sigma0’ … pref=0

  • ‘sigma1’ … pref=1000

  • ‘sigma2’ … pref=2000

  • ‘sigma3’ … pref=3000

  • ‘sigma4’ … pref=4000

  • ‘sigma5’ … pref=5000

Returns:

data:

xarray dataset object, containing potential density

vname:

str, string with variable name of potential density


tripyview.sub_data.do_select_levidx(data, mesh, depth, depidx)[source]

–> select vertical levels based on depth list

Parameters:

data:

xarray dataset object

mesh:

fesom2 mesh object

depth:

int, list, np.array, range (default=None). Select single depth level that will be interpolated or select list of depth levels that will be interpolated and vertically averaged. If None all vertical layers in data are loaded

depidx:

bool, (default:False) if depth is int and depidx=True, depth index is selected that is closest to depth. No interpolation will be done

Returns:

data:

xarray dataset object


tripyview.sub_data.do_select_time(data, mon, day, record, str_mtim)[source]

–> select specific month, day or record number

Parameters:

data:

xarray dataset object

mon:

list, (default=None), specific month that should be selected from dataset. If mon selection leads to no data selection, because data maybe annual, selection is set to mon=None

day:

list, (default=None), same as mon but for day

record:

int, list, (default=None), load specific record number from dataset. Overwrites effect of mon and sel_day

str_mtim:

str, time label string; on input it already contains the selection of years (e.g. ‘y:1958-2019’), info from the selection of month or days is added here

Returns:

data:

returns xarray dataset object

mon:

list, with mon that got selected otherwise None

day:

list, with day that got selected otherwise None

str_ltim:

str, with selected time information


tripyview.sub_data.do_setbottomnan(mesh, data, do_nan)[source]

–> replace bottom fill values with nan (default value is zero)

Parameters:

mesh:

fesom2 tripyview mesh object, with all mesh information

data:

xarray dataset object

do_nan:

bool, do replace bottom fill values with nan

Returns:

data:

xarray dataset object


tripyview.sub_data.do_time_arithmetic(data, do_tarithm)[source]

–> do arithmetic over time dimension

Parameters:

data:

xarray dataset object

do_tarithm:

str (default=’mean’) do time arithmetic on time selection option are

  • None, ‘None’

  • ‘mean’ mean over entire time dimension

  • ‘median’

  • ‘std’

  • ‘var’

  • ‘max’,

  • ‘min’,

  • ‘sum’

  • ‘ymean’,’annual’ mean over year dimension

  • ‘mmean’,’monthly’ mean over month dimension

  • ‘dmean’,’daily’ mean over day dimension

Returns:

data:

xarray dataset object

str_atim:

str, which time arithmetic was applied


tripyview.sub_data.do_vector_norm(data, do_norm)[source]

–> compute vector norm: vname=’vec+u+v’

Parameters:

data:

xarray dataset object

do_norm:

bool, should vector norm be computed

Returns:

data:

xarray dataset object


tripyview.sub_data.do_vector_rotation(data, mesh, do_vec, do_vecrot, do_sclrv)[source]

–> compute rotation of vector: vname=’vec+u+v’

Parameters:

data:

xarray dataset object

mesh:

fesom2 mesh object

do_vec:

bool, should data be considered as vectors

do_vecrot:

bool, should rotation be applied

Returns:

data:

xarray dataset object


tripyview.sub_data.load_data_fesom2(mesh, datapath, vname=None, year=None, mon=None, day=None, record=None, depth=None, depidx=False, do_tarithm='mean', do_zarithm='mean', do_zweight=False, do_hweight=True, do_nan=True, do_ie2n=True, do_vecrot=True, do_filename=False, do_file='run', descript='', runid='fesom', do_prec='float32', do_f14cmip6=False, do_compute=False, do_load=True, do_persist=False, do_parallel=False, chunks={'edg_n': 'auto', 'elem': 'auto', 'ndens': 'auto', 'nod2': 'auto', 'nz': 'auto', 'nz1': 'auto', 'time': 'auto'}, do_showtime=False, do_info=True, **kwargs)[source]

–> general loading of fesom2 and fesom14cmip6 data

Parameters:

mesh:

fesom2 tripyview mesh object, with all mesh information

datapath:

str, path that leads to the FESOM2 data

vname:

str, (default: None), variable name that should be loaded

year:

int, list, np.array, range, default: None, single year or list/array of years whose files should be opened

mon:

list, (default=None), specific month that should be selected from dataset. If mon selection leads to no data selection, because data maybe annual, selection is set to mon=None

day:

list, (default=None), same as mon but for day

record:

int,list, (default=None), load specific record number from dataset. Overwrites effect of mon and sel_day

depth:

int, list, np.array, range (default=None). Select single depth level that will be interpolated or select list of depth levels that will be interpolated and vertically averaged. If None all vertical layers in data are loaded

depidx:

bool, (default:False) if depth is int and depidx=True, depth index is selected that is closest to depth. No interpolation will be done

do_tarithm:

str (default=’mean’) do time arithmetic on time selection option are: None, ‘None’, ‘mean’, ‘median’, ‘std’, ‘var’, ‘max’ ‘min’, ‘sum’

do_zarithm:

str (default=’mean’) do arithmetic on selected vertical layers options are: None, ‘None’, ‘mean’, ‘max’, ‘min’, ‘sum’

do_zweight:

bool, (default=False) store weights for vertical weighted averages within the dataset

do_hweight:

bool, (default=True), store weights for weighted horizontal averages

do_nan:

bool (default=True), do replace bottom fill values with nan

do_ie2n:

bool (default=True), if data are on elements automatically interpolates them to vertices –> easier to plot

do_vecrot:

bool (default=True), if vector data are loaded e.g. vname=’vec+u+v’ rotates them from the rotated frame (in which they are stored) to geo coordinates

do_filename:

str, (default=None) load this specific filename string instead of path selection via datapath and year

do_file:

str, (default=’run’), which data should be loaded options are

  • ‘run’ … fesom2 simulation files should be load,

  • ‘restart_oce’ … fesom2 ocean restart file should be loaded,

  • ‘restart_ice’ … fesom2 ice restart file should be loaded

  • ‘blowup’ … fesom2 ocean blowup file will be loaded

descript:

str (default=’’), string to describe dataset is written into variable attributes

runid:

str (default=’fesom’), runid of loaded data

do_prec:

str, (default=’float32’) which precision is used for the loading of the data

do_f14cmip6:

bool, (default=False), Set to true when loading cmorized FESOM1.4 CMIP6 data when computing AMOC

do_compute:

bool (default=False), do xarray dataset compute() at the end data = data.compute(), creates a new dataobject the original data object seems to persist

do_load:

bool (default=True), do xarray dataset load() at the end data = data.load(), applies all operations to the original dataset

do_persist:

bool (default=False), do xarray dataset persist() at the end data = data.persist(), keeps the dataset as dask array, keeps the chunking

chunks:

dict(), (default=dict({‘time’:’auto’, …})) dictionary with chunksize of specific dimensions. By default set to auto, but it can also be set to a specific value. In my observation the loading of data was a factor 2-3 faster with auto-chunking, but this might have its limitations for very large meshes

do_showtime:

bool, (default=False) show time information stored in dataset

do_info:

bool (default=True), print variable info at the end

Returns:

data:

object, returns xarray dataset object

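A minimal usage sketch (path, years and variable names are placeholders; only parameters documented above are used, and the mesh object is assumed to exist already):

from tripyview.sub_data import load_data_fesom2

# 'mesh' is assumed to be an already created fesom2 tripyview mesh object
# winter mean salinity, vertically averaged over the upper 250 m
salt = load_data_fesom2(mesh, '/path/to/fesom/output/', vname='salt',
                        year=range(1990, 2000), mon=[1, 2, 3],
                        depth=[0, 100, 250], do_tarithm='mean',
                        do_zarithm='mean', descript='exp1')

# vector variables use the 'vec+' syntax and are rotated to geo coordinates
uv = load_data_fesom2(mesh, '/path/to/fesom/output/', vname='vec+u+v',
                      year=2000, depth=100, do_vecrot=True)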

tripyview.sub_transect.calc_transect_scalar(mesh, data, transects, nodeinelem=None, do_transectattr=False, do_info=False)[source]
–> interpolate scalar vertice values onto the cutting point positions where the cross-section intersects with an edge

Parameters:

mesh:

fesom2 tripyview mesh object, with all mesh information

data:

object, with xarray dataset object containing 3d vertice values. Can also be done with scalar data on elements, but then nodeinelem is needed

transects:

list with analysed transect dictionary information computed by do_analyse_transects

nodeinelem:

np.array with elem indices that contribute to a vertice point (default=None)

do_transectattr:

bool, (default=False) write full transect info into the returned xarray dataset attributes

do_info:

bool, (default=False), write info

Return:

data:

list with returned xarray dataset object that contains to the cutting point interpolated scalar values of transect


tripyview.sub_transect.calc_transect_transp(mesh, data, transects, do_transectattr=False, do_rot=True, do_info=True)[source]

–> compute FESOM2 model-accurate transport through a defined transect

Parameters:

mesh:

fesom2 tripyview mesh object, with all mesh information

data:

object, with xarray dataset object with 3d element zonal and meridional velocities

transects:

list with analysed transect dictionary information computed by do_analyse_transects

do_transectattr:

bool, (default=False) write full transect info into the returned xarray dataset attributes

do_rot:

bool, (default=True), do rotation of velocities from rotated to geo coordinates

do_info:

bool, (default=True), write info

Return:

data:

list with returned xarray dataset object that contain volume transport through every section of the transport path


tripyview.sub_transect.do_analyse_transects(input_transect, mesh, edge, edge_tri, edge_dxdy_l, edge_dxdy_r, do_rot=True, do_info=False, Pdxy=10.0)[source]
–> pre-analyse defined transects with respect to which triangles and edges are crossed by the transect line. Create the transport path edges to compute model-accurate volume transports.

Parameters:

input_transect:

list, to define transects, transects= [[[lon pts], [lat pts], name], […]]

mesh:

fesom2 mesh object, with all mesh information

edge:

np.array([2, nedges]), edge array with indices of participating vertice points. Best load info from fesom.mesh.diag.nc

edge_tri:

np.array([2, nedges]), edge triangle array of indices of triangles that participate in the edge [2 x nedges]. If edge is boundary edge edge_tri[1, idx] = -1. Best load info from fesom.mesh.diag.nc

edge_dxdy_l:

np.array([2, nedges]), array with dx, dy cartesian length of distance from edge mid points to centroid of left side triangle from edge. Best load info from fesom.mesh.diag.nc

edge_dxdy_r:

np.array([2, nedges]), array with dx, dy cartesian length of distance from edge mid points to centroid of right side triangle from edge. Best load info from fesom.mesh.diag.nc

do_rot:

bool, (default=True) assume that the edge_dxdy_l, edge_dxdy_r arrays are in the rotated coordinate frame and need to be rotated into geo coordinates

do_info:

bool, (default=False) print info of transect dictionary

Pdxy:

float, (default=10.0) buffer lon/lat width of the box around the transect for the preselection of cut edges within this box. On very coarse meshes, if it looks like certain edges are not cut, increase this number

Return:

transect_list:

list of transect dictionary

transect dictionary keys:

#_______________________________________________________________________
# arrays that define cross-section 
transect['Name'         ] = [] # Name of transect
transect['lon'          ] = [] # transect defining longitude list
transect['lat'          ] = [] # transect defining latitude list
transect['ncsi'         ] = [] # running index of number of defined transects
transect['ncs'          ] = [] # number transect defining points
transect['Px'           ] = [] # lon points  that define the transect edges 
transect['Py'           ] = [] # lat points  that define the transect edges 
transect['e_vec'        ] = [] # unit vector of transect edges
transect['e_norm'       ] = [] # length of unit vector (length of edge)
transect['n_vec'        ] = [] # normal vector of transect edges
transect['alpha'        ] = [] # bearing of transect edge

#_______________________________________________________________________
# arrays that define the intersection between cross-section and edges
transect['edge_cut_i'   ] = [] # indices of edges that are cut by the transect
transect['edge_cut_evec'] = [] # unit vector of those cut edges
transect['edge_cut_P'   ] = [] # lon, lat point where the transect cuts the edge
transect['edge_cut_midP'] = [] # mid points of the cut edges
transect['edge_cut_lint'] = [] # interpolator for cutting points on edge
transect['edge_cut_ni'  ] = [] # node indices of the intersected edges
transect['edge_cut_dist'] = [] # distance of cut edge midpoint from start point of transect

#_______________________________________________________________________
# arrays to define transport path
transect['path_xy'      ] = [] # lon/lat coordinates, edge midpoints --> elem centroid --> edge mid points ...
transect['path_ei'      ] = [] # elem indices
transect['path_ni'      ] = [] # node indices of elems 
transect['path_dx'      ] = [] # dx of path sections
transect['path_dy'      ] = [] # dy of path sections
transect['path_dist'    ] = [] # distance along path sections
transect['path_nvec_cs' ] = [] # normal vector of transection segment

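A minimal workflow sketch (the mesh object and the edge arrays are assumed to be already loaded from fesom.mesh.diag.nc as described above; transect coordinates, paths and the do_ie2n/do_vecrot choices are illustrative assumptions only):

from tripyview.sub_transect import do_analyse_transects, calc_transect_transp
from tripyview.sub_data import load_data_fesom2

# 'mesh', 'edge', 'edge_tri', 'edge_dxdy_l', 'edge_dxdy_r' are assumed to be
# already loaded (the edge arrays come from fesom.mesh.diag.nc, see above)
transects = [[[-80, -10], [26.5, 26.5], 'Atlantic 26.5N']]   # [[lon pts], [lat pts], name]
tlist = do_analyse_transects(transects, mesh, edge, edge_tri,
                             edge_dxdy_l, edge_dxdy_r, do_rot=True)

# element velocities: keep them on elements (do_ie2n=False) and in the rotated
# frame (do_vecrot=False), since calc_transect_transp rotates them itself (do_rot)
uv = load_data_fesom2(mesh, '/path/to/fesom/output/', vname='vec+u+v',
                      year=2010, depth=None, do_ie2n=False, do_vecrot=False)
transp = calc_transect_transp(mesh, uv, tlist)
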
tripyview.sub_transect.do_transect_anomaly(index1, index2)[source]

–> compute anomaly between two transect list xarray Datasets

Parameters:

index1:

list with transect xarray dataset object

index2:

list with transect xarray dataset object

Returns:

anom:

list with transect xarray dataset object, data1-data2


tripyview.sub_transect.load_zmeantransect_fesom2(mesh, data, box_list, dlat=0.5, boxname=None, diagpath=None, do_checkbasin=False, do_compute=False, do_load=True, do_persist=False, do_info=False, **kwargs)[source]

–> compute zonal mean transect, defined by regional box_list

Parameters:

mesh:

fesom2 mesh object, with all mesh information

data:

xarray dataset object, or list of xarray dataset object with 3d vertice data

box_list:

None, list (default: None) list with regional box limitation for index computation box can be:

  • [‘global’] … compute global index

  • [shp.Reader] … index region defined by shapefile

  • [ [lonmin,lonmax,latmin, latmax], boxname] index region defined by rect box

  • [ [ [px1,px2…], [py1,py2,…]], boxname] index region defined by polygon

  • [ np.array(2 x npts), boxname] index region defined by polygon

dlat:

float, (default=0.5) resolution of latitudinal bins

diagpath:

str, (default=None), path to the diagnostic file; only needed when the w_A weights for the area average are not given in the dataset, then the routine will search for the diagnostic file

do_checkbasin:

bool, (default=False) additional plot with selected region/ basin information

do_compute:

bool (default=False), do xarray dataset compute() at the end data = data.compute(), creates a new dataobject the original data object seems to persist

do_load:

bool (default=True), do xarray dataset load() at the end data = data.load(), applies all operations to the original dataset

do_persist:

bool (default=False), do xarray dataset persist() at the end data = data.persist(), keeps the dataset as dask array, keeps the chunking

do_info:

bool (default=False), print variable info at the end

Returns:

index_list:

list with xarray dataset of zonal mean array


tripyview.sub_transect.plot_transect_position(mesh, transect, edge=None, zoom=None, fig=None, figsize=[10, 10], proj='nears', box=[-180, 180, -90, 90], resolution='low', do_path=True, do_labels=True, do_title=True, do_grid=False, ax_pos=[0.9, 0.05, 0.45, 0.45])[source]

–> plot transect positions

Parameters:

mesh:

fesom2 tripyview mesh object, with all mesh information

transect:

analysed transect dictionary information computed by do_analyse_transects

edge:

provide np.array with node indices of edge (default=None)

zoom:

float, (default=None), zoom factor for nearside projection

fig:

matplotlib figure handle, (default=None)

figsize:

list, (default=[10,10]) width and height of figure

proj:

str, (default=’nears’), can be any of the following projection strings:

  • pc … PlateCarree (box=[lonmin, lonmax, latmin, latmax])

  • merc … Mercator (box=[lonmin, lonmax, latmin, latmax])

  • rob … Robinson (box=[lonmin, lonmax, latmin, latmax])

  • eqearth… EqualEarth (box=[lonmin, lonmax, latmin, latmax])

  • mol … Mollweide (box=[lonmin, lonmax, latmin, latmax])

  • nps … NorthPolarStereo (box=[-180, 180, >0, latmax])

  • sps … SouthPolarStereo (box=[-180, 180, latmin, <0])

  • ortho … Orthographic (box=[loncenter, latcenter])

  • nears … NearsidePerspective (box=[loncenter, latcenter, zoom])

  • channel… PlateCarree

box:

None or list (default: [-180, 180, -90, 90]), regional limitation of the plot. For ortho: box=[lonc, latc]; for nears: box=[lonc, latc, zoom]; for all others: box=[lonmin, lonmax, latmin, latmax]. For nears the box is computed based on the transect definition.

do_path:

bool, (default=True) plot entire transport path

do_labels:

bool, (default=True) draw lon, lat axis labels

do_title:

bool, (default=True) draw title with name of transect

do_grid:

bool, (default=False) plot fesom mesh in background

Returns:

hfig:

returns figure handle

hax:

returns axes handle
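
A minimal usage sketch (variable names are hypothetical; it assumes the transect dictionary was already computed by the transect analysis routine):

# hypothetical example: plot the transect position on a nearside perspective projection
hfig, hax = tpv.plot_transect_position(mesh, transect, proj='nears', do_path=True)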


tripyview.sub_index.do_indexanomaly(index1, index2)[source]

–> compute anomaly between two index xarray Datasets

Parameters:

index1:

index xarray dataset object

index2:

index xarray dataset object

Returns:

anom:

index xarray dataset object, index1 - index2


tripyview.sub_index.load_index_fesom2(mesh, data, box_list, boxname=None, do_harithm='wmean', do_zarithm=None, do_checkbasin=False, do_compute=False, do_load=True, do_persist=False)[source]

–> compute index over a region from 2d and 3d vertex data

Parameters:

mesh:

fesom2 mesh object, with all mesh information

data:

xarray dataset object, or list of xarray dataset objects, with 3d vertex data

box_list:

None or list (default: None), list with regional box limitations for the index computation; a box can be:

  • [‘global’] … compute global index

  • [shp.Reader] … index region defined by shapefile

  • [ [lonmin,lonmax,latmin, latmax], boxname] index region defined by rect box

  • [ [ [px1,px2…], [py1,py2,…]], boxname] index region defined by polygon

  • [ np.array(2 x npts), boxname] index region defined by polygon

do_harithm:

str, (default=’wmean’) which horizontal arithmetic should be applied over the index definition region

  • ‘wmean’ … area weighted mean

  • ‘wint’ … area weighted integral

do_zarithm:

str, (default=None) which arithmetic should be applied in the vertical

  • ‘wmean’ … depth weighted mean

  • ‘wint’ … depth weighted integral

do_checkbasin:

bool, (default=False) additional plot with selected region/ basin information

do_compute:

bool (default=False), do xarray dataset compute() at the end, data = data.compute(); creates a new data object, the original data object seems to persist

do_load:

bool (default=True), do xarray dataset load() at the end data = data.load(), applies all operations to the original dataset

do_persist:

bool (default=False), do xarray dataset persist() at the end data = data.persist(), keeps the dataset as dask array, keeps the chunking

do_info:

bool (default=False), print variable info at the end

Returns:

index_list:

list with xarray dataset objects of the computed index
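
A minimal usage sketch (variable names are hypothetical; it assumes the mesh and a vertex dataset were already loaded):

# hypothetical example: area weighted mean index over a rectangular box
box        = [[-80.0, 0.0, 20.0, 60.0], 'NorthAtlantic']
index_list = tpv.load_index_fesom2(mesh, data, box_list=[box], do_harithm='wmean')
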
tripyview.sub_index.plot_index_region(mesh, idx_IN, box_list, which='hard')[source]

–> plot index definition region

Parameters:

mesh:

fesom2 tripyview mesh object, with all mesh information

idx_IN:

list with bool np.array of which vertices are within the index definition region

box_list:

None or list (default: None), list with regional box limitations for the index computation; a box can be:

  • [‘global’] … compute global index

  • [shp.Reader] … index region defined by shapefile

  • [ [lonmin,lonmax,latmin, latmax], boxname] index region defined by rect box

  • [ [ [px1,px2…], [py1,py2,…]], boxname] index region defined by polygon

  • [ np.array(2 x npts), boxname] index region defined by polygon

which:

str, (default=’hard’) how limiting the triangle selection should be:

  • ‘soft’ … plot triangles that have at least one selected vertex in them

  • ‘mid’ … plot triangles that have more than one selected vertex in them

  • ‘hard’ … plot triangles that have all three selected vertices in them

Returns:

None:


tripyview.sub_zmoc.calc_bottom_patch(data, lat_bin, idx_iz, idxin)[source]

–> compute idealized AMOC bottom patch based on the 3d data bathymetry structure

Parameters:

data:

xarray MOC data object

lat_bin:

xarray Array with latitudinal bins

idx_iz:

xarray DataArray with the 2d bottom z-level index; can be given for elements or vertices:

idx_iz = xr.DataArray(mesh.e_iz, dims=['elem']), or
idx_iz = xr.DataArray(mesh.n_iz, dims=['nod2'])

idxin:

bool index array for regional basin selection

Returns:

data:

xarray MOC data object with additional coord: botnice
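
A minimal usage sketch (variable names are hypothetical; it assumes xarray is imported as xr and that a binned MOC dataset, the latitude bins and a boolean basin mask already exist):

# hypothetical example: element based bottom index, as in the snippet above
idx_iz = xr.DataArray(mesh.e_iz, dims=['elem'])
zmoc   = tpv.calc_bottom_patch(zmoc, lat_bin, idx_iz, idxin)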


tripyview.sub_zmoc.calc_zmoc(mesh, data, dlat=1.0, which_moc='gmoc', do_onelem=False, diagpath=None, do_checkbasin=False, do_parallel=False, n_workers=10, do_compute=False, do_load=True, do_persist=False, do_info=True, **kwargs)[source]

–> calculate meridional overturning circulation (pseudo-streamfunction) from vertical velocities, either on vertices or elements

Parameters:

mesh:

fesom2 tripyview mesh object, with all mesh information

data:

xarray dataset object with 3d vertical velocities

dlat:

float (default=1.0), latitudinal binning resolution

which_moc:

str, shp.Reader() (default=’gmoc’) which global or regional MOC should be computed based on present day shapefiles. Options are:

  • ‘gmoc’ … compute global MOC

  • ‘amoc’ … compute MOC for Atlantic Basin

  • ‘aamoc’ … compute MOC for Atlantic+Arctic Basin

  • ‘pmoc’ … compute MOC for Pacific Basin

  • ‘ipmoc’ … compute MOC for Indo-Pacific Basin (PMOC how it should be)

  • ‘imoc’ … compute MOC for Indian-Ocean Basin

  • shp.Reader(‘path’) … compute MOC based on custom shapefile

Important: between ‘amoc’ and ‘aamoc’ there is not much difference in variability, but up to 1.5 Sv in amplitude, where ‘aamoc’ is stronger than ‘amoc’. There is no clear rule which one is better, just be sure you are consistent.

do_onelem:

bool (default=False) … should computation be done on elements or vertices

diagpath:

str (default=None) if str, custom path to a specific fesom.mesh.diag.nc file; if None, the routine looks automatically in the mesh folder and in the original datapath folder (stored as attribute in the xarray dataset object)

do_checkbasin:

bool (default=False) provide plot with regional basin selection

do_parallel:

bool (default=False) do computation of binning based MOC in parallel

n_workers:

int (default=10) how many workers (CPUs) are used for the parallelized MOC computation

do_compute:

bool (default=False), do xarray dataset compute() at the end, data = data.compute(); creates a new data object, the original data object seems to persist

do_load:

bool (default=True), do xarray dataset load() at the end data = data.load(), applies all operations to the original dataset

do_persist:

bool (default=False), do xarray dataset persist() at the end data = data.persist(), keeps the dataset as dask array, keeps the chunking

do_info:

bool (default=True), print variable info at the end

Returns:

zmoc:

object, returns xarray dataset object with MOC

data_list = list()

# load the 3d vertical velocity 'w'; keep it as a chunked dask array (do_persist=True)
data = tpv.load_data_fesom2(mesh, datapath, vname='w', year=year, descript=descript,
                            do_zarithm=None, do_nan=False, do_load=False, do_persist=True)

# compute the Atlantic MOC with 1 degree latitudinal bins
zmoc = tpv.calc_zmoc(mesh, data, dlat=1.0, which_moc='amoc')

data_list.append(zmoc)

tripyview.sub_dmoc.calc_dmoc(mesh, data_dMOC, dlat=1.0, which_moc='gmoc', which_transf=None, do_checkbasin=False, exclude_meditoce=False, do_bolus=True, do_parallel=False, n_workers=10, do_compute=False, do_load=True, do_persist=False, do_info=True, do_dropvar=True, **kwargs)[source]

–> calculate the density meridional overturning circulation (pseudo-streamfunction) from density class data, either on vertices or elements

Parameters:

mesh:

fesom2 tripyview mesh object, with all mesh information

data_dMOC:

xarray dataset object with 3d density class data

dlat:

float (default=1.0), latitudinal binning resolution

which_moc:

str, shp.Reader() (default=’gmoc’) which global or regional MOC should be computed based on present day shapefiles. Options are:

  • ‘gmoc’ … compute global MOC

  • ‘amoc’ … compute MOC for Atlantic Basin

  • ‘aamoc’ … compute MOC for Atlantic+Arctic Basin

  • ‘pmoc’ … compute MOC for Pacific Basin

  • ‘ipmoc’ … compute MOC for Indo-Pacific Basin (PMOC how it should be)

  • ‘imoc’ … compute MOC for Indian-Ocean Basin

  • shp.Reader(‘path’) … compute MOC based on custom shapefile

Important: between ‘amoc’ and ‘aamoc’ there is not much difference in variability, but up to 1.5 Sv in amplitude, where ‘aamoc’ is stronger than ‘amoc’. There is no clear rule which one is better, just be sure you are consistent.

which_transf:

str (default=’dmoc’) which transformation should be computed; options are:

  • ‘dmoc’ compute dmoc density transformation

  • ‘srf’ compute density transform. from surface forcing

  • ‘inner’ compute density transform. from interior mixing (dmoc-srf)

do_checkbasin:

bool (default=False) provide plot with regional basin selection

exclude_meditoce:

bool (default=False) exclude the Mediterranean Sea from the basin selection

do_bolus:

bool (default=True) load the density class divergence from the bolus velocity and add it to the total density class divergence

do_dropvar:

bool (default=True) drop all variables from the dataset that are not absolutely needed

do_parallel:

bool (default=False) do computation of binning based MOC in parallel

n_workers:

int (default=10) how many workers (CPUs) are used for the parallelized MOC computation

do_compute:

bool (default=False), do xarray dataset compute() at the end, data = data.compute(); creates a new data object, the original data object seems to persist

do_load:

bool (default=True), do xarray dataset load() at the end data = data.load(), applies all operations to the original dataset

do_persist:

bool (default=False), do xarray dataset persist() at the end data = data.persist(), keeps the dataset as dask array, keeps the chunking

do_info:

bool (default=True), print variable info at the end

Returns:

dmoc:

object, returns xarray dataset object with DMOC

data_list = list()

# load the density class data needed for the DMOC computation; keep it as chunked dask arrays
data = tpv.load_dmoc_data(mesh, datapath, std_dens, year=year, which_transf='dmoc', descript=descript,
                          do_zcoord=True, do_bolus=True, do_load=False, do_persist=True)

# vname is the MOC basin string, e.g. 'amoc'
dmoc = tpv.calc_dmoc(mesh, data, dlat=1.0, which_moc=vname, which_transf='dmoc')

data_list.append(dmoc)

tripyview.sub_dmoc.do_ztransform(data)[source]

–>

tripyview.sub_dmoc.do_ztransform_hydrography(mesh, data)[source]

–>

tripyview.sub_dmoc.do_ztransform_martin(mesh, data)[source]

–>

tripyview.sub_dmoc.do_ztransform_mom6(mesh, data)[source]

–>

tripyview.sub_dmoc.load_dmoc_data(mesh, datapath, std_dens, year=None, which_transf='dmoc', do_tarithm='mean', do_bolus=True, add_bolus=False, add_trend=False, do_wdiap=False, do_dflx=False, do_zcoord=True, do_useZinfo='std_dens_H', do_ndensz=False, descript='', do_compute=False, do_load=True, do_persist=False, do_parallel=False, do_info=True, **kwargs)[source]

–> load the data that is necessary for the density MOC computation

Parameters:

mesh:

fesom2 tripyview mesh object, with all mesh information

datapath:

str, path that leads to the FESOM2 data

std_dens:

np.array with sigma2 density bins that were used in FESOM2 for the DMOC diagnostic

year:

int, list, np.array, range (default: None), single year or list/array of years whose files should be opened

which_transf:

str (default=’dmoc’) which transformation should be computed; options are:

  • ‘dmoc’ compute dmoc density transformation

  • ‘srf’ compute density transform. from surface forcing

  • ‘inner’ compute density transform. from interior mixing (dmoc-srf)

do_tarithm:

str (default=’mean’) do time arithmetic on the time selection; options are: None, ‘None’, ‘mean’, ‘median’, ‘std’, ‘var’, ‘max’, ‘min’, ‘sum’

do_bolus:

bool (default=True) load the density class divergence from the bolus velocity and add it to the total density class divergence

add_bolus:

bool (default=False) include the density class divergence from the bolus velocity as a separate variable in the xarray dataset object

add_trend:

bool (default=False) include the density class volume trend as a separate variable in the xarray dataset object

do_wdiap:

bool (default=False) load data to be used to only look at diapycnal vertical velocity

do_dflx:

bool (default=False) load the data to be used for the computation of the surface buoyancy forced transformation vertical velocities

do_zcoord:

bool (default=True) do density MOC remapping back into zcoord space

do_useZinfo:

str (default=’std_dens_H’) which data should be used for the zcoord remapping. Options are:

  • ‘std_dens_H’ use the mean layer thickness of the density classes (best option)

  • ‘hydrography’ use the mean sigma2 hydrography to estimate the z position of the density classes (OK),

  • ‘density_dMOC’ use the density_dMOC variable to estimate the z position of the density classes (bad),

  • ‘std_dens_Z’ use the mean depth of the density classes (very bad)

do_ndensz:

bool (default=False) already compute here the density class z position from the density class layer thickness by cumulative summation

descript:

str (default=’’), string to describe dataset is written into variable attributes

do_compute:

bool (default=False), do xarray dataset compute() at the end, data = data.compute(); creates a new data object, the original data object seems to persist

do_load:

bool (default=True), do xarray dataset load() at the end data = data.load(), applies all operations to the original dataset

do_persist:

bool (default=False), do xarray dataset persist() at the end data = data.persist(), keeps the dataset as dask array, keeps the chunking

chunks:

dict(), (default=dict({‘time’:’auto’, …})) dictionary with the chunksize of specific dimensions. By default set to auto, but it can also be set to a specific value. In my observation the loading of data was a factor of 2-3 faster with auto-chunking, but this might have its limitations for very large meshes

do_info:

bool (default=True), print variable info at the end

Returns:

data:

object, returns xarray dataset object with the density class information


tripyview.sub_colormap.categorical_cmap(nc, nsc, cmap='tab10', cmap2='nipy_spectral', continuous=False, light2dark=True)[source]

–> build a categorical colormap based on a predefined standard matplotlib colormap (e.g. tab10) and split each color into a number of sub-colors from light to dark

Parameters:

nc:

int, number of colors in colormap

nsc:

int, number of sub-colors in each color

cmap:

str, (default=”tab10”) name of matplotlib colormap

cmap2:

str, (default=”nipy_spectral”) replacement colormap when the number of colors nc exceeds the number of colors in the actual colormap

continuous:

bool, (default=False)

light2dark:

bool, (default=True), True: do the sub-categorical color interpolation from light grey –> color, False: do it from color –> dark grey

Returns:

cmap:

matplotlib.colors.ListedColormap computed based on the input parameters
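
A minimal usage sketch (hypothetical values):

# hypothetical example: 5 main colors from tab10, each split into 4 sub-colors
cmap = tpv.categorical_cmap(5, 4, cmap='tab10')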


tripyview.sub_colormap.colormap_c2c(cmin, cmax, cref, cnumb, cname, cstep=None, do_slog=False, do_rescal=None, rescal_ref=None, cmap_arr=None)[source]

–> create a reference-value-centered colormap with distinct color steps based on an RGB matrix

Parameters:

cmin:

float, value of minimum color

cmax:

float, value of maximum color

cref:

float, value of reference color

cnumb:

int, minimum number of colors that should at least be in the colormap

cname:

str, name of colormap –> see the colormap definition; this can be extended by everyone. If you have a nice new colormap definition, come back to me and let’s add it to the defaults!

  • blue2red, red2blue

  • dblue2dred, dred2dblue

  • green2orange, orange2green

  • grads

  • rainbow

  • heat

  • jet

  • jetw (white reference color)

  • hsv

  • gnuplot

  • arc

  • wbgyr, rygbw (white, blue, green, yellow, red)

  • odv

  • odvw (white reference color)

  • wvt

  • wvt1

  • wvt2

  • wvt3

  • seaice

  • precip

  • drought

  • ars

If you put a “_i” behind the color name string, the colormap will be inverted

cstep:

float, (default=None) provide fixed color step instead of computing it based on the covered value range and the minimum number of colors

do_slog:

bool, (default=False) provide colormap for symmetric logarithmic scaling

do_rescal:

np.array, (default=None) provide np.array with non linear colorstep values

rescal_ref:

float, (default=None) provide the reference value in the non-linear colorstep array

cmap_arr:

np.array, (default=None) provide a (nc x 3) RGB value colormap definition array from outside, besides the predefined ones. nc must be an odd number. The index np.ceil(nc/2) will represent the color of the reference value

Returns:

cmap:

matplotlib.colors.ListedColormap computed based on the input parameters

clevel:

np.array with the chosen clevel values, needed for the contour and colorbar definition

cref:

float, returns used cref value
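
A minimal usage sketch (hypothetical values): build a blue-to-red colormap centered on zero and reuse the returned clevel, e.g. as contour levels.

# hypothetical example: at least 20 colors between -2 and 2, centered at 0
cmap, clevel, cref = tpv.colormap_c2c(-2.0, 2.0, 0.0, 20, 'blue2red')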


Indices and tables