JULES can now run multiple points in parallel, using multiple cores on the same machine or a cluster of machines. This is accomplished using MPI (Message Passing Interface), a standardised system for passing messages between processes. Several implementations of MPI are available, the most commonly used being MPICH2 and OpenMPI.
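As a rough sketch, a parallel run is launched through the MPI launcher in the usual way (the executable name and process count here are illustrative, not prescriptive):

    mpirun -np 4 jules.exe

The exact launcher (mpirun, mpiexec, or a batch-system wrapper) depends on the MPI implementation and the machine.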
JULES takes advantage of the parallel I/O features in HDF5 / NetCDF4. These are not enabled by default, and so must be explicitly enabled when HDF5 / NetCDF4 are compiled. More information on how to do this can be found on the NetCDF website.
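As a minimal sketch of such a build (the installation paths are placeholders and the exact flags vary between versions, so the NetCDF website should be treated as authoritative):

    # HDF5: enable parallel I/O, compiling with the MPI wrapper
    CC=mpicc ./configure --enable-parallel
    make && make install

    # NetCDF4: build against the parallel HDF5 installation
    CC=mpicc CPPFLAGS=-I/path/to/hdf5/include LDFLAGS=-L/path/to/hdf5/lib ./configure
    make && make install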
Information on how to build and run JULES in parallel can be found in the JULES User Guide.
Note that although this development has proven stable during testing, it is still experimental and is intended for advanced users only.
From a user's point of view, the most important change is that the JULES documentation and coding standards are now provided in two forms: HTML (the preferred format) and PDF. The HTML documentation is also available on the web at http://www.jchmr.org/jules/documentation/.
This has been made possible by migrating the documentation from a single massive Word document to the Sphinx documentation generator (with some custom extensions to better support Fortran namelists). Although Sphinx was originally intended for documenting Python projects, its extensibility has seen it adopted for a wide range of projects, and it has several advantages over the previous monolithic Word document.
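For reference, the HTML pages are generated from the documentation sources with the standard Sphinx build command, along these lines (the directory names are illustrative):

    sphinx-build -b html source build/html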
The JULES repository on PUMA has also been refactored so that configurations, documentation and examples sit in a separate project to the core Fortran code.
An option has been added to prescribe the grid-box mean snow-free albedo to match a given input (e.g. observations or a climatology). See the new switch l_albedo_obs.
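As an illustrative namelist fragment only (the switch name comes from this release note, but the namelist group shown is a placeholder; see the documentation for where the switch actually lives):

    ! Illustrative only: the group name is a placeholder
    &jules_switches
      l_albedo_obs = .true.   ! scale tile albedos towards the prescribed values
    /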
For shortwave (SW) albedos, the albedos of the individual tiles are scaled linearly so that the grid-box mean albedo matches the observations, within limits for each tile. When visible (VIS) and near-infrared (NIR) albedos are required, the input parameters are scaled and corrected in a similar manner.
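The scaling step can be pictured with a short sketch (this is not JULES source; the variable names, tile values, and limits are all invented for illustration): a single factor brings the fraction-weighted mean to the observed value, after which each tile is clamped to its allowed range.

    ! Illustrative only -- not JULES source; names and values are invented.
    program albedo_scale_demo
      implicit none
      integer, parameter :: ntiles = 3
      real :: frac(ntiles) = (/ 0.5, 0.3, 0.2 /)    ! tile fractions
      real :: alb(ntiles)  = (/ 0.15, 0.20, 0.30 /) ! model tile albedos
      real :: alb_min = 0.05, alb_max = 0.80        ! per-tile limits (uniform here)
      real :: alb_obs = 0.22                        ! prescribed grid-box mean
      real :: factor
      integer :: i

      factor = alb_obs / sum(frac * alb)   ! one linear factor for the grid box
      do i = 1, ntiles
        alb(i) = min(max(alb(i) * factor, alb_min), alb_max)   ! clamp per tile
      end do
      ! After clamping, the mean can drift from alb_obs, which is why a
      ! correction step is described above.
      print *, 'scaled grid-box mean =', sum(frac * alb)
    end program albedo_scale_demo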
The change was included in the Global Land configuration at vn5.0: http://collab.metoffice.gov.uk/trac/GL/ticket/8.