Documentation updates from develop (#4361)
* Fix grammar and content in HDF5Examples (#4333)

* Add VDS and SWMR to documentation (#4336)

* Fix dead links cont. (#4349)

Added img/images_to_copy.dox as a temporary solution because doxygen didn't copy
the images used in the examples/*.html files - will investigate more.  This was
necessary for the links to intro_SWMR.html and intro_VDS.html.
lrknox authored Apr 9, 2024
1 parent 7496d58 commit 9d60786
Showing 16 changed files with 225 additions and 25 deletions.
2 changes: 1 addition & 1 deletion doxygen/aliases
@@ -243,7 +243,7 @@ ALIASES += ref_mdc_in_hdf5="<a href=\"https://portal.hdfgroup.org/display/HDF5/M
ALIASES += ref_mdc_logging="<a href=\"https://portal.hdfgroup.org/display/HDF5/H5F_START_MDC_LOGGING\">Metadata Cache Logging</a>"
ALIASES += ref_news_112="<a href=\"https://portal.hdfgroup.org/documentation/hdf5-docs/release_specifics/new_features_1_12.html\">New Features in HDF5 Release 1.12</a>"
ALIASES += ref_h5ocopy="<a href=\"https://portal.hdfgroup.org/display/HDF5/Copying+Committed+Datatypes+with+H5Ocopy\">Copying Committed Datatypes with H5Ocopy()</a>"
ALIASES += ref_sencode_fmt_change="<a href=\"https://portal.hdfgroup.org/pages/viewpage.action?pageId=58100093&preview=/58100093/58100094/encode_format_RFC.pdf\">RFC H5Secnode() / H5Sdecode() Format Change</a>"
ALIASES += ref_sencode_fmt_change="<a href=\"https://docs.hdfgroup.org/hdf5/rfc/H5Sencode_format.docx.pdf\">RFC H5Sencode() / H5Sdecode() Format Change</a>"
ALIASES += ref_vlen_strings="\Emph{Creating variable-length string datatypes}"
ALIASES += ref_vol_doc="VOL documentation"

16 changes: 15 additions & 1 deletion doxygen/dox/TechnicalNotes.dox
@@ -7,6 +7,8 @@
\li \ref IOFLOW
\li \ref TNMDC
\li \ref MT
\li \ref SWMR
\li \ref VDS
\li \ref VFL

*/
@@ -45,4 +47,16 @@

\htmlinclude DebuggingHDF5Applications.html

*/
*/

/** \page SWMR Introduction to Single-Writer/Multiple-Reader (SWMR)

\htmlinclude intro_SWMR.html

*/

/** \page VDS Introduction to the Virtual Dataset - VDS

\htmlinclude intro_VDS.html

*/
2 changes: 1 addition & 1 deletion doxygen/dox/ViewTools.dox
@@ -997,7 +997,7 @@ In other words, it is an array of four elements, in which each element is a 3 by

This dataset is much more complex. Also note that subsetting cannot be done on Array datatypes.

See this <a href="https://portal.hdfgroup.org/display/knowledge/H5T_ARRAY+Datatype">FAQ</a> for more information on the Array datatype.
See this <a href="https://docs.hdfgroup.org/hdf5/114/_l_b_datatypes.html">section</a> for more information on the Array datatype.

\subsubsection subsubsecViewToolsViewDtypes_objref Object Reference
An Object Reference is a reference to an entire object (dataset, group, or named datatype).
103 changes: 103 additions & 0 deletions doxygen/examples/intro_SWMR.html
@@ -0,0 +1,103 @@
<html>
<head>
<title>Introduction to Single-Writer/Multiple-Reader (SWMR)</title>
</head>
<body>

<h2 id="introduction-to-swmr">Introduction to SWMR</h2>
<p>The Single-Writer / Multiple-Reader (SWMR) feature enables multiple processes to read an HDF5 file while it is being written to (by a single process) without using locks or requiring communication between processes.</p>
<p><img src="tutr-swmr1.png" alt="tutr-swmr1.png" width="500"></p>
<p>All communication between processes must be performed via the HDF5 file. The HDF5 file under SWMR access must reside on a system that complies with POSIX write() semantics.</p>
<p>The basic engineering challenge for this to work was to ensure that the readers of an HDF5 file always see a coherent (though possibly not up to date) HDF5 file.</p>
<p>The issue is that when writing data there is information in the metadata cache in addition to the physical file on disk:</p>
<p><img src="tutr-swmr2.png" alt="tutr-swmr2.png" width="500"></p>
<p>However, the readers can only see the state contained in the physical file:</p>
<p><img src="tutr-swmr3.png" alt="tutr-swmr3.png" width="500"></p>
<p>The SWMR solution implements dependencies on when the metadata can be flushed to the file. This ensures that metadata cache flush operations occur in the proper order, so that there will never be internal file pointers in the physical file that point to invalid (unflushed) file addresses.</p>
<p>A beneficial side effect of using SWMR access is better fault tolerance. It is more difficult to corrupt a file when using SWMR.</p>
<h2 id="documentation">Documentation</h2>
<h3 id="swmr-users-guide"><a href="https://docs.hdfgroup.org/hdf5/tn/HDF5_SWMR_User_Guide.pdf">SWMR User&#39;s Guide</a></h3>
<h3 id="hdf5-library-apis">HDF5 Library APIs</h3>
<ul>
<li><a href="https://docs.hdfgroup.org/hdf5/develop/group___s_w_m_r.html#ga159be34fbe7e4a959589310ef0196dfe">H5F_START_SWMR_WRITE</a> — Enables SWMR writing mode for a file</li>
<li><a href="https://docs.hdfgroup.org/hdf5/develop/group___h5_d_o.html#ga316caac160af15192e0c78228667341e">H5DO_APPEND</a> — Appends data to a dataset along a specified dimension</li>
<li>H5P_SET_OBJECT_FLUSH_CB — Sets a callback function to invoke when an object flush occurs in the file</li>
<li>H5P_GET_OBJECT_FLUSH_CB — Retrieves the object flush property values from the file access property list</li>
<li>H5O_DISABLE_MDC_FLUSHES — Prevents metadata entries for an HDF5 object from being flushed from the metadata cache to storage</li>
<li>H5O_ENABLE_MDC_FLUSHES — Enables flushing of dirty metadata entries from a file’s metadata cache</li>
<li>H5O_ARE_MDC_FLUSHES_DISABLED — Determines if an HDF5 object has had flushes of metadata entries disabled</li>
</ul>
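<p>The flush-related property APIs listed above follow HDF5's usual property-list and callback patterns. As an informal sketch (not taken from the original page; the file name and callback body are placeholders), an object-flush callback might be registered on a file access property list like this:</p>
<pre><code>#include "hdf5.h"
#include &lt;stdio.h&gt;

/* Invoked each time an object in the file is flushed */
static herr_t
flush_cb(hid_t object_id, void *user_data)
{
    (void)user_data;
    printf("object %lld was flushed\n", (long long)object_id);
    return 0;
}

int
main(void)
{
    hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);

    /* Register the callback; the last argument is an optional user-data pointer */
    H5Pset_object_flush_cb(fapl, flush_cb, NULL);

    hid_t fid = H5Fcreate("flush_cb.h5", H5F_ACC_TRUNC, H5P_DEFAULT, fapl);
    /* ... create and write objects; flush_cb fires on each object flush ... */
    H5Fclose(fid);
    H5Pclose(fapl);
    return 0;
}
</code></pre>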
<h3 id="tools">Tools</h3>
<ul>
<li>h5watch — Outputs new records appended to a dataset as the dataset grows</li>
<li>h5format_convert — Converts the layout format version and chunked indexing types of datasets created with HDF5-1.10 so that applications built with HDF5-1.8 can access them</li>
<li>h5clear — Clears superblock status_flags field, removes metadata cache image, prints EOA and EOF, or sets EOA of a file</li>
</ul>
<h3 id="design-documents">Design Documents</h3>
<h2 id="programming-model">Programming Model</h2>
<p>Please be aware that the SWMR feature requires that an HDF5 file be created with the latest file format. See H5P_SET_LIBVER_BOUNDS for more information.</p>
<p>To use SWMR, follow the general programming model for creating and accessing HDF5 files and objects, along with the steps described below.</p>
<h3 id="swmr-writer">SWMR Writer:</h3>
<p>The SWMR writer either opens an existing file and objects or creates them as follows.</p>
<p>Open an existing file:</p>
<ol>
<li>Call H5Fopen using the H5F_ACC_SWMR_WRITE flag.</li>
<li>Begin writing datasets.</li>
<li>Periodically flush data.</li>
</ol>
<p>Create a new file:</p>
<ol>
<li>Call H5Fcreate using the latest file format.</li>
<li>Create groups, datasets, and attributes, and then close the attributes.</li>
<li>Call H5F_START_SWMR_WRITE to start SWMR access to the file.</li>
<li>Periodically flush data.</li>
</ol>
<h4 id="example-code">Example Code:</h4>
<p>Create the file using the latest file format property:</p>
<pre><code>fapl = H5Pcreate (H5P_FILE_ACCESS);
status = H5Pset_libver_bounds (fapl, H5F_LIBVER_LATEST, H5F_LIBVER_LATEST);
fid = H5Fcreate (filename, H5F_ACC_TRUNC, H5P_DEFAULT, fapl);
</code></pre>
<p>[Create objects (files, datasets, ...). Close any attributes and named datatype objects. Groups and datasets may remain open before starting SWMR access to them.]</p>
<p>Start SWMR access to the file:</p>
<pre><code>status = H5Fstart_swmr_write (fid);
</code></pre>
<p>Reopen the datasets and start writing, periodically flushing data:</p>
<pre><code>status = H5Dwrite (dset_id, ...);
status = H5Dflush (dset_id);
</code></pre>
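<p>For reference, the condensed sketch below combines the writer steps into one compilable unit. It is illustrative only: the file name swmr.h5, the dataset name data, and the 1024-element chunk are assumptions, and error checking is omitted.</p>
<pre><code>#include "hdf5.h"

int
main(void)
{
    int     buf[1024];
    hsize_t dims[1]    = {0};
    hsize_t maxdims[1] = {H5S_UNLIMITED};
    hsize_t chunk[1]   = {1024};

    for (int i = 0; i &lt; 1024; i++)
        buf[i] = i;

    /* SWMR requires the latest file format */
    hid_t fapl = H5Pcreate(H5P_FILE_ACCESS);
    H5Pset_libver_bounds(fapl, H5F_LIBVER_LATEST, H5F_LIBVER_LATEST);
    hid_t fid = H5Fcreate("swmr.h5", H5F_ACC_TRUNC, H5P_DEFAULT, fapl);

    /* An appendable dataset must be chunked with an unlimited dimension */
    hid_t dcpl = H5Pcreate(H5P_DATASET_CREATE);
    H5Pset_chunk(dcpl, 1, chunk);
    hid_t space = H5Screate_simple(1, dims, maxdims);
    hid_t dset  = H5Dcreate2(fid, "data", H5T_NATIVE_INT, space,
                             H5P_DEFAULT, dcpl, H5P_DEFAULT);

    /* Attributes and named datatypes must be closed before this call;
     * groups and datasets may remain open */
    H5Fstart_swmr_write(fid);

    /* Append one chunk of data, then flush so readers can see it */
    hsize_t newsize[1] = {1024};
    H5Dset_extent(dset, newsize);
    hid_t   fspace   = H5Dget_space(dset);
    hsize_t start[1] = {0}, count[1] = {1024};
    H5Sselect_hyperslab(fspace, H5S_SELECT_SET, start, NULL, count, NULL);
    hid_t mspace = H5Screate_simple(1, count, NULL);
    H5Dwrite(dset, H5T_NATIVE_INT, mspace, fspace, H5P_DEFAULT, buf);
    H5Dflush(dset);

    H5Sclose(mspace); H5Sclose(fspace); H5Sclose(space);
    H5Pclose(dcpl);   H5Pclose(fapl);
    H5Dclose(dset);   H5Fclose(fid);
    return 0;
}
</code></pre>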
<h3 id="swmr-reader">SWMR Reader:</h3>
<p>The SWMR reader must continually poll for new data:</p>
<ol>
<li>Call H5Fopen using the H5F_ACC_SWMR_READ flag.</li>
<li>Poll, checking the size of the dataset to see if there is new data available for reading.</li>
<li>Read new data, if any.</li>
</ol>
<h4 id="example-code-1">Example Code:</h4>
<p>Open the file using the SWMR read flag:</p>
<pre><code>fid = H5Fopen (filename, H5F_ACC_RDONLY | H5F_ACC_SWMR_READ, H5P_DEFAULT);
</code></pre>
<p>Open the dataset and then repeatedly poll the dataset, by getting the dimensions, reading new data, and refreshing:</p>
<pre><code>dset_id = H5Dopen (...);
space_id = H5Dget_space (...);
while (...) {
   status = H5Dread (dset_id, ...);
   status = H5Drefresh (dset_id);
   space_id = H5Dget_space (...);
}
</code></pre>
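<p>A self-contained polling reader might look like the sketch below. It is illustrative only: the file and dataset names mirror the hypothetical writer sketch above, the loop polls a fixed ten times, and error checking is omitted.</p>
<pre><code>#include "hdf5.h"
#include &lt;stdio.h&gt;
#include &lt;stdlib.h&gt;
#include &lt;unistd.h&gt;

int
main(void)
{
    hid_t fid  = H5Fopen("swmr.h5", H5F_ACC_RDONLY | H5F_ACC_SWMR_READ,
                         H5P_DEFAULT);
    hid_t dset = H5Dopen2(fid, "data", H5P_DEFAULT);
    hsize_t nread = 0;                        /* elements seen so far */

    for (int poll = 0; poll &lt; 10; poll++) {
        H5Drefresh(dset);                     /* pick up the writer's flushes */

        hid_t   fspace = H5Dget_space(dset);
        hsize_t dims[1];
        H5Sget_simple_extent_dims(fspace, dims, NULL);

        if (dims[0] &gt; nread) {                /* new data has appeared */
            hsize_t start[1] = {nread};
            hsize_t count[1] = {dims[0] - nread};
            int    *buf = malloc((size_t)count[0] * sizeof(int));

            /* Read only the newly appended elements */
            H5Sselect_hyperslab(fspace, H5S_SELECT_SET, start, NULL, count, NULL);
            hid_t mspace = H5Screate_simple(1, count, NULL);
            H5Dread(dset, H5T_NATIVE_INT, mspace, fspace, H5P_DEFAULT, buf);
            printf("read %llu new elements\n", (unsigned long long)count[0]);

            free(buf);
            H5Sclose(mspace);
            nread = dims[0];
        }
        H5Sclose(fspace);
        sleep(1);                             /* poll once per second */
    }
    H5Dclose(dset);
    H5Fclose(fid);
    return 0;
}
</code></pre>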
<h2 id="limitations-and-scope">Limitations and Scope</h2>
<p>An HDF5 file under SWMR access must reside on a system that complies with POSIX write() semantics. It is also limited in scope as follows:</p>
<p>The writer process is only allowed to modify raw data of existing datasets, by:</p>
<ul>
<li>Appending data along any unlimited dimension</li>
<li>Modifying existing data</li>
</ul>
<p>The following operations are not allowed (and the corresponding HDF5 calls will fail):</p>
<ul>
<li>The writer cannot add new objects to the file.</li>
<li>The writer cannot delete objects in the file.</li>
<li>The writer cannot modify or append data with variable-length, string, or region reference datatypes.</li>
<li>File space recycling is not allowed. As a result, the size of a file modified by a SWMR writer may be larger than that of a file modified by a non-SWMR writer.</li>
</ul>
<h2 id="tools-for-working-with-swmr">Tools for Working with SWMR</h2>
<p>Two new tools, h5watch and h5clear, are available for use with SWMR. The other HDF5 utilities have also been modified to recognize SWMR:</p>
<ul>
<li>The h5watch tool allows a user to monitor the growth of a dataset.</li>
<li>The h5clear tool clears the status flags in the superblock of an HDF5 file.</li>
<li>The rest of the HDF5 tools will exit gracefully but otherwise do not work with SWMR.</li>
</ul>
<h2 id="programming-example">Programming Example</h2>
<p>A good example of using SWMR is included with the HDF5 tests in the source code. You can run the writer while reading the file it creates; if you then interrupt the writer and the reader and look at the resulting file, you will see that the file is still valid. Follow these steps:</p>
<ol>
<li>Download the HDF5-1.10 source code to a local directory on a filesystem (that complies with POSIX write() semantics). Build the software; no special configuration options are needed to use SWMR.</li>
<li>Invoke two command terminal windows. In one window go into the bin/ directory of the built binaries. In the other window go into the test/ directory of the HDF5-1.10 source code that was just built.</li>
<li>In the window in the test/ directory, compile and run use_append_chunk.c. The example writes a three-dimensional dataset by planes (with chunks of size 1 x 256 x 256).</li>
<li>In the other window (in the bin/ directory), run h5watch on the file created by use_append_chunk.c (use_append_chunk.h5). Run it while use_append_chunk is executing, and you will see valid data displayed by h5watch.</li>
<li>Interrupt use_append_chunk while it is running, and stop h5watch.</li>
<li>Use h5clear to clear the status flags in the superblock of the HDF5 file (use_append_chunk.h5).</li>
<li>View the file with h5dump. You will see that it is a valid file even though the application did not close properly; it will contain data up to the point that it was interrupted.</li>
</ol>

</body></html>
72 changes: 72 additions & 0 deletions doxygen/examples/intro_VDS.html
Original file line number Diff line number Diff line change
@@ -0,0 +1,72 @@
<html>
<head>
<title>Introduction to the Virtual Dataset - VDS</title>
</head>
<body>

<p>The HDF5 Virtual Dataset (VDS) feature enables users to access data in a collection of HDF5 files as a single HDF5 dataset and to use the HDF5 APIs to work with that dataset.</p>
<p>For example, your data may be collected into four files:</p>

<p><img src="tutrvds-multimgs.png" alt="tutrvds-multimgs.png" width=750></center>

<p>You can map the datasets in the four files into a single VDS that can be accessed just like any other dataset:</p>

<p><img src="tutrvds-snglimg.png" alt="tutrvds-snglimg.png" width=500></center>

<p>The mapping between a VDS and the HDF5 source datasets is persistent and transparent to an application. If a source file is missing, the fill value will be displayed.</p>
<p>See the Virtual Dataset (VDS) documentation for complete details regarding the VDS feature.</p>
<p>The VDS feature was implemented using hyperslab selection (H5S_SELECT_HYPERSLAB). See the tutorial on Reading From or Writing to a Subset of a Dataset for more information on selecting hyperslabs.</p>
<h2 id="programming-model">Programming Model</h2>
<p>To create a Virtual Dataset you simply follow the HDF5 programming model and add a few additional API calls to map the source datasets to the VDS.</p>
<p>Following are the steps for creating a Virtual Dataset:</p>
<ol>
<li>Create the source datasets that will comprise the VDS.</li>
<li>Create the VDS:
<ul>
<li>Define a datatype and dataspace (can be unlimited).</li>
<li>Define the dataset creation property list (including fill value).</li>
<li>(Repeat for each source dataset) Map elements from the source dataset to elements of the VDS:
<ul>
<li>Select elements in the source dataset (source selection).</li>
<li>Select elements in the virtual dataset (destination selection).</li>
<li>Map destination selections to source selections (see Functions for Working with a VDS).</li>
</ul>
</li>
<li>Call H5Dcreate using the properties defined above.</li>
</ul>
</li>
<li>Access the VDS as a regular HDF5 dataset.</li>
<li>Close the VDS when finished.</li>
</ol>
<h2 id="functions-for-working-with-a-vds">Functions for Working with a VDS</h2>
<p>The H5P_SET_VIRTUAL API sets the mapping between virtual and source datasets. It is set on a dataset creation property list, and using it changes the layout of the dataset to H5D_VIRTUAL. As with any other dataset creation property list, an instance of the property list is created, modified, passed into the dataset creation call, and then closed:</p>
<pre><code>dcpl = H5Pcreate (H5P_DATASET_CREATE);

src_space = H5Screate_simple (...);
status = H5Sselect_hyperslab (space, ...);
status = H5Pset_virtual (dcpl, space, SRC_FILE[i], SRC_DATASET[i], src_space);

dset = H5Dcreate2 (file, DATASET, H5T_NATIVE_INT, space, H5P_DEFAULT, dcpl, H5P_DEFAULT);

status = H5Pclose (dcpl);
</code></pre>
<p>There are several other APIs introduced with Virtual Datasets, including query functions. For details, see the complete list of HDF5 library APIs that support Virtual Datasets.</p>
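<p>As an informal illustration of this pattern (not one of the shipped examples), the sketch below creates a 4 x 6 virtual dataset with a fill value of -1 and maps its first row to a single source dataset. The names vds.h5, a.h5, /A, and VDS are placeholders, and error checking is omitted:</p>
<pre><code>#include "hdf5.h"

int
main(void)
{
    hsize_t vdims[2] = {4, 6};   /* virtual dataset: 4 rows of 6 ints */
    hsize_t sdims[1] = {6};      /* each source dataset: 6 ints */
    int     fill     = -1;       /* shown where no source data exists */

    hid_t vspace    = H5Screate_simple(2, vdims, NULL);
    hid_t src_space = H5Screate_simple(1, sdims, NULL);
    hid_t dcpl      = H5Pcreate(H5P_DATASET_CREATE);
    H5Pset_fill_value(dcpl, H5T_NATIVE_INT, &amp;fill);

    /* Map row 0 of the VDS to the whole source dataset /A in a.h5 */
    hsize_t start[2] = {0, 0}, count[2] = {1, 6};
    H5Sselect_hyperslab(vspace, H5S_SELECT_SET, start, NULL, count, NULL);
    H5Pset_virtual(dcpl, vspace, "a.h5", "/A", src_space);

    hid_t file = H5Fcreate("vds.h5", H5F_ACC_TRUNC, H5P_DEFAULT, H5P_DEFAULT);
    hid_t dset = H5Dcreate2(file, "VDS", H5T_NATIVE_INT, vspace,
                            H5P_DEFAULT, dcpl, H5P_DEFAULT);

    H5Dclose(dset);
    H5Fclose(file);
    H5Pclose(dcpl);
    H5Sclose(src_space);
    H5Sclose(vspace);
    return 0;
}
</code></pre>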
<h2 id="limitations">Limitations</h2>
<p>This feature requires HDF5-1.10 or later.</p>
<p>The number of source datasets is unlimited. However, there is a limit on the size of each source dataset.</p>
<h2 id="programming-examples">Programming Examples</h2>
<h3 id="example-1">Example 1</h3>
<p>This example creates three HDF5 files, each with a one-dimensional dataset of 6 elements. The datasets in these files are the source datasets that are then used to create a 4 x 6 Virtual Dataset with a fill value of -1. The first three rows of the VDS are mapped to the data from the three source datasets, as shown below:</p>
<p><img src="tutrvds-ex.png" alt="tutrvds-ex.png" width=500></p>
<p>In this example the three source datasets are mapped to the VDS with this code:</p>
<pre><code>src_space = H5Screate_simple (RANK1, dims, NULL);
for (i = 0; i &lt; 3; i++) {
    start[0] = (hsize_t)i;
    /* Select i-th row in the virtual dataset; selection in the source datasets is the same. */
    status = H5Sselect_hyperslab (space, H5S_SELECT_SET, start, NULL, count, block);
    status = H5Pset_virtual (dcpl, space, SRC_FILE[i], SRC_DATASET[i], src_space);
}
</code></pre>
<p>After the VDS is created and closed, it is reopened. The property list is then queried to determine the layout of the dataset and its mappings, and the data in the VDS is read and printed.</p>
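<p>A minimal sketch of that query step is shown below (the file and dataset names are assumed from the description above, and error checking is omitted). It reopens the VDS, retrieves its creation property list, and prints each mapping:</p>
<pre><code>#include "hdf5.h"
#include &lt;stdio.h&gt;

int
main(void)
{
    hid_t  file  = H5Fopen("vds.h5", H5F_ACC_RDONLY, H5P_DEFAULT);
    hid_t  dset  = H5Dopen2(file, "VDS", H5P_DEFAULT);
    hid_t  dcpl  = H5Dget_create_plist(dset);
    size_t nmaps = 0;

    if (H5Pget_layout(dcpl) == H5D_VIRTUAL) {     /* confirm virtual layout */
        H5Pget_virtual_count(dcpl, &amp;nmaps);
        for (size_t i = 0; i &lt; nmaps; i++) {
            char fname[256], dname[256];
            H5Pget_virtual_filename(dcpl, i, fname, sizeof(fname));
            H5Pget_virtual_dsetname(dcpl, i, dname, sizeof(dname));
            printf("Mapping %zu: %s : %s\n", i, fname, dname);
        }
    }
    H5Pclose(dcpl);
    H5Dclose(dset);
    H5Fclose(file);
    return 0;
}
</code></pre>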
<p>This example is in the HDF5 source code and can be obtained from here:</p>
<p>C Example</p>
<p>For details on compiling an HDF5 application: [ Compiling HDF5 Applications ]</p>
<h3 id="example-2">Example 2</h3>
<p>This example shows how to use a C-style printf format string to specify multiple source datasets as one virtual dataset. Only one mapping is required; in other words, only one H5P_SET_VIRTUAL call is needed to map multiple datasets. It creates a 2-dimensional unlimited VDS, then reopens the file, makes queries, and reads the virtual dataset.</p>
<p>The source datasets are specified as A-0, A-1, A-2, and A-3. These are mapped to the virtual dataset with one call:</p>
<pre><code>status = H5Pset_virtual (dcpl, vspace, SRCFILE, &quot;/A-%b&quot;, src_space);
</code></pre>
<p>The %b indicates that the block count of the selection in the dimension should be used.</p>
<p>C Example</p>
<p>For details on compiling an HDF5 application: [ Compiling HDF5 Applications ]</p>
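<p>An informal sketch of this single printf-style mapping is shown below. The 10 x 10 source dimensions and the file names are assumptions for illustration: the unlimited hyperslab selection pairs one 10 x 10 block per source dataset, and %b in the source dataset name is resolved from the block count.</p>
<pre><code>#include "hdf5.h"

#define DIM0 10
#define DIM1 10

int
main(void)
{
    hsize_t sdims[2]   = {DIM0, DIM1};        /* each source: 10 x 10 */
    hsize_t vdims[2]   = {0, DIM1};           /* VDS grows along dim 0 */
    hsize_t maxdims[2] = {H5S_UNLIMITED, DIM1};

    hid_t src_space = H5Screate_simple(2, sdims, NULL);
    hid_t vspace    = H5Screate_simple(2, vdims, maxdims);

    /* One unlimited selection: block b of the VDS maps to dataset /A-b */
    hsize_t start[2]  = {0, 0};
    hsize_t stride[2] = {DIM0, 1};
    hsize_t count[2]  = {H5S_UNLIMITED, 1};
    hsize_t block[2]  = {DIM0, DIM1};
    H5Sselect_hyperslab(vspace, H5S_SELECT_SET, start, stride, count, block);

    hid_t dcpl = H5Pcreate(H5P_DATASET_CREATE);
    H5Pset_virtual(dcpl, vspace, "srcfile.h5", "/A-%b", src_space);

    hid_t file = H5Fcreate("vds_unlim.h5", H5F_ACC_TRUNC, H5P_DEFAULT,
                           H5P_DEFAULT);
    hid_t dset = H5Dcreate2(file, "VDS", H5T_NATIVE_INT, vspace,
                            H5P_DEFAULT, dcpl, H5P_DEFAULT);

    H5Dclose(dset); H5Fclose(file); H5Pclose(dcpl);
    H5Sclose(vspace); H5Sclose(src_space);
    return 0;
}
</code></pre>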
<h2 id="using-h5dump-with-a-vds">Using h5dump with a VDS</h2>
<p>The h5dump utility can be used to view a VDS. The h5dump output for a VDS looks exactly like that for any other dataset. If h5dump cannot find a source dataset, then the fill value will be displayed.</p>
<p>You can determine that a dataset is a VDS by looking at its properties with h5dump -p. It will display each source dataset mapping, beginning with Mapping 0. Below is an excerpt of the output of h5dump -p on the vds.h5 file created in Example 1. You can see that the entire source file a.h5 is mapped to the first row of the /VDS dataset:</p>

<p><img src="tutrvds-map.png" alt="tutrvds-map.png" width=650></p>
</body></html>
11 changes: 11 additions & 0 deletions doxygen/img/images_to_copy.dox
Original file line number Diff line number Diff line change
@@ -0,0 +1,11 @@
/** \page HTML_IMGS Images for html files

<img src=tutrvds-map.png>
<img src=tutrvds-ex.png>
<img src=tutr-swmr3.png>
<img src=tutr-swmr2.png>
<img src=tutr-swmr1.png>
<img src=tutrvds-snglimg.png>
<img src=tutrvds-multimgs.png>

*/
Binary file added doxygen/img/tutr-swmr1.png
Binary file added doxygen/img/tutr-swmr2.png
Binary file added doxygen/img/tutr-swmr3.png
Binary file added doxygen/img/tutrvds-ex.png
Binary file added doxygen/img/tutrvds-map.png
Binary file added doxygen/img/tutrvds-multimgs.png
Binary file added doxygen/img/tutrvds-snglimg.png
26 changes: 12 additions & 14 deletions release_docs/INSTALL
@@ -16,11 +16,12 @@ CONTENTS
--------
1. Obtaining HDF5
2. Third-party Software Requirements
2.1. Zlib
2.2 Szip (optional)
2.3. MPI and MPI-IO


2.1 zlib
2.2 Szip (optional)
2.3 MPI and MPI-IO
3. HDF5 Source Code and Precompiled Binaries
4. Build and Install HDF5 on Unix and Mac OSX Platforms with Autotools
5. Build and Install HDF5 Libraries and Tools with CMake

*****************************************************************************

@@ -29,19 +30,19 @@ CONTENTS
https://github.com/HDFGroup/hdf5.

2. Third-party Software Requirements
2.1. Zlib
2.1. zlib
The HDF5 library includes a predefined compression filter that
uses the "deflate" method for chunked datasets. If zlib-1.1.2 or
later is found, HDF5 will use it. Otherwise, HDF5's predefined
compression method will degenerate to a no-op; the compression
compression method will be disabled; the compression
filter will succeed but the data will not be compressed.

2.2. Szip (optional)
The HDF5 library includes a predefined compression filter that
uses the extended-Rice lossless compression algorithm for chunked
datasets.

Building instructions are available with the Szip source code.
Szip source code includes build instructions.

The HDF Group does not distribute separate Szip precompiled libraries,
but the HDF5 pre-built binaries provided on The HDF Group download page
@@ -65,15 +66,12 @@ CONTENTS

3. HDF5 Source Code and Precompiled Binaries
The HDF Group provides source code and pre-compiled binaries from the
HDF5 github releases page:
HDF5 GitHub releases page:

https://github.com/HDFGroup/hdf5/releases

4. Build and Install HDF5 on Unix and Mac OSX Platforms with autotools
4. Build and Install HDF5 on Unix and Mac OSX Platforms with Autotools
see the release_docs/INSTALL_Autotools.txt file.

5. Build and Install HDF5 Libraries and tools with CMake
5. Build and Install HDF5 Libraries and Tools with CMake
see the release_docs/INSTALL_CMake.txt file.



15 changes: 8 additions & 7 deletions release_docs/RELEASE.txt
@@ -169,15 +169,16 @@ New Features

- Incorporated HDF5 examples repository into HDF5 library.

The HDF5Examples folder is equivalent to the repository hdf5-examples.
As such it can build and test the examples during library build or after
the library is installed. Previously, the hdf5-repository archives were
downloaded for packaging with the library. Now the examples can be built
The HDF5Examples folder is equivalent to the hdf5-examples repository.
This enables building and testing the examples
during the library build process or after the library has been installed.
Previously, the hdf5-examples archives were downloaded
for packaging with the library. Now the examples can be built
and tested without a packaged install of the library.

However to maintain the ability to use the HDF5Examples with an installed
library, it is necessary to translate or synch the option names from those
used by the library to those used by the examples. The typical pattern is:
However, to maintain the ability to use the HDF5Examples with an installed
library, it is necessary to map the option names used by the library
to those used by the examples. The typical pattern is:
<example option> = <library option>
HDF_BUILD_FORTRAN = ${HDF5_BUILD_FORTRAN}
