pkgsrc/devel/hdf5/Makefile

hdf5: Update to 1.10.6

New Features
============

Configuration:
--------------
- Update CMake for VS2019 support

  CMake added support for VS2019 in version 3.15. Changes to the CMake
  generator setting required changes to scripts. Also updated version
  references in CMake files as necessary.
  (ADB - 2019/11/18, HDFFV-10962)

- Update CMake options to match new autotools options

  Add configure options (autotools - CMake):
    enable-asserts       HDF5_ENABLE_ASSERTS
    enable-symbols       HDF5_ENABLE_SYMBOLS
    enable-profiling     HDF5_ENABLE_PROFILING
    enable-optimization  HDF5_ENABLE_OPTIMIZATION
  In addition, NDEBUG is no longer force-defined and relies on the
  CMake process.
  (ADB - 2019/10/07, HDFFV-100901, HDFFV-10637, TRILAB-97)

- Update CMake tests to use FIXTURES

  CMake test fixtures allow setup/cleanup tests and other dependency
  requirements as properties for tests. This is more flexible for
  modern CMake code.
  (ADB - 2019/07/23, HDFFV-10529)

- Windows PDB files are always installed

  There are build configurations or flag settings for Windows that may
  not generate PDB files. If those files are not generated, the install
  utility will fail because the PDB files are not found. An optional
  variable, DISABLE_PDB_FILES, was added to skip installing PDB files.
  (ADB - 2019/07/17, HDFFV-10424)

- Add mingw CMake support with a toolchain file

  A number of mingw issues have been linked under HDFFV-10845. It was
  decided to implement the CMake cross-compiling technique of toolchain
  files. A Linux platform with the mingw compiler stack is used for
  testing. Only the C language is fully supported, and the error tests
  are skipped. The C++ language works for static builds, but shared
  builds have a shared-library issue with the mingw Standard Exception
  Handling library, which is not available on Windows. Fortran has a
  common cross-compile problem with the Fortran configure tests.
  (ADB - 2019/07/12, HDFFV-10845, HDFFV-10595)

- Windows PDB files are installed incorrectly

  For static builds, the PDB files for Windows should be installed next
  to the static libraries in the lib folder. The debug versions of
  libraries and PDB files are also now correctly built using the
  default CMAKE_DEBUG_POSTFIX setting.
  (ADB - 2019/07/09, HDFFV-10581)

- Add option to build only shared libs

  A request was made to prevent building static libraries and only
  build shared ones. A new CMake option, ONLY_SHARED_LIBS, was added to
  skip building static libraries. Certain utility functions will build
  with static libs but are not published. Tests are adjusted to use the
  correct libraries depending on SHARED/STATIC settings.
  (ADB - 2019/06/12, HDFFV-10805)

- Add options to enable or disable building tools and tests

  Configure options --enable-tests and --enable-tools were added for
  autotools configure. These options are enabled by default and can be
  disabled with either --disable-tests (or --disable-tools) or
  --enable-tests=no (or --enable-tools=no). Build time is reduced ~20%
  when tools are disabled, 35% when tests are disabled, and 45% when
  both are disabled. Re-enabling them after the initial build requires
  running configure again with the option(s) enabled.
  (LRK - 2019/06/12, HDFFV-9976)

- Change tools tests to search the error stack

  Some use cases can cause the error stack of tools to differ from the
  expected output. These tests now use grepTest.cmake, which was
  changed to allow the error file to be searched for an expected
  string.
  (ADB - 2019/04/15, HDFFV-10741)

Library:
--------
- Added S3 and HDFS Virtual File Drivers (VFDs) to HDF5

  These new VFDs were introduced in HDF5-1.10.6. Instructions to
  enable them when configuring HDF5 on Linux and Mac may be found at
  https://portal.hdfgroup.org/display/HDF5/Virtual+File+Drivers+-+S3+and+HDFS.

  Installing on Windows requires CMake 3.13 and the following
  additional setup.

  Install the openssl library (with dev files) from "Shining Light
  Productions"; the msi package is preferred:
    - PATH should have been updated with the installation dir.
    - Set the ENV variable OPENSSL_ROOT_DIR to the installation dir.
    - Set the ENV variable OPENSSL_CONF to the cfg file, likely
      %OPENSSL_ROOT_DIR%\bin\openssl.cfg

  Install the libcurl library (with dev files):
    - Download the latest released version using git:
      https://github.com/curl/curl.git
    - Open a Visual Studio command prompt.
    - Change to the libcurl root folder and run the "buildconf.bat"
      batch file.
    - Change to the winbuild directory and run
      nmake /f Makefile.vc mode=dll MACHINE=x64
    - Copy the libcurl-vc-x64-release-dll-ipv6-sspi-winssl dir to
      C:\curl (the installation dir).
    - Set the ENV variable CURL_ROOT to C:\curl (the installation dir).
    - Update the PATH ENV variable with %CURL_ROOT%\bin (the
      installation bin dir).

  The aws credentials file should be in the %USERPROFILE%\.aws folder.
  Set the ENV variable
  HDF5_ROS3_TEST_BUCKET_URL=https://s3.us-east-2.amazonaws.com/hdf5ros3
  (ADB - 2019/09/12, HDFFV-10854)

C++ Library:
------------
- Added new wrappers for H5Pset/get_create_intermediate_group():
    LinkCreatPropList::setCreateIntermediateGroup()
    LinkCreatPropList::getCreateIntermediateGroup()
  (BMR - 2019/04/22, HDFFV-10622)

Java Library:
-------------
- Fixed a failure in JUnit-TestH5P on 32-bit architectures
  (JTH - 2019/04/30)

Support for new platforms, languages and compilers
==================================================
- CMake added support for VS2019 in version 3.15. Updated scripts.
- macOS 10.13.6 Darwin 17.7.0 with Apple clang LLVM version 10.0.0
- macOS 10.14.6 Darwin 18.7.0 with Apple clang LLVM version 10.0.1

Bug Fixes since HDF5-1.10.5 release
===================================

Library
-------
- Improved performance when creating a large number of small datasets
  by retrieving default property values from the API context instead
  of doing skip-list searches. More work is required to achieve parity
  with HDF5 1.8.
  (CJH - 2019/12/10, HDFFV-10658)

- Fixed user-created data access properties not existing in the
  property list returned by H5Dget_access_plist. Thanks to Steven
  Varga for submitting a reproducer and a patch.
  (CJH - 2019/12/9, HDFFV-10934)

- Inappropriate linking with deprecated MPI C++ libraries

  HDF5 does not define *_SKIP_MPICXX in the public headers, so
  applications can inadvertently wind up linking to the deprecated MPI
  C++ wrappers. MPICH_SKIP_MPICXX and OMPI_SKIP_MPICXX have both been
  defined in H5public.h, so this should no longer be an issue. HDF5
  makes no use of the deprecated MPI C++ wrappers.
  (DER - 2019/09/17, HDFFV-10893)

- fcntl(2)-based file locking incorrectly passed the lock argument
  struct instead of a pointer to the struct, causing errors on systems
  where flock(2) is not available.

  File locking is used when files are opened to enforce SWMR
  semantics. A lock operation takes place on all file opens unless the
  HDF5_USE_FILE_LOCKING environment variable is set to the string
  "FALSE". flock(2) is preferentially used, with fcntl(2) locks as a
  backup if flock(2) is unavailable on a system (if neither is
  available, the lock operation fails). On these systems, the file
  lock will often fail, which causes HDF5 to not open the file and to
  report an error. This bug only affects POSIX systems. Win32 builds
  on Windows use a no-op locking call which always succeeds. Systems
  which exhibit this bug will have H5_HAVE_FCNTL defined but not
  H5_HAVE_FLOCK in the configure output. This bug affects HDF5 1.10.0
  through 1.10.5. fcntl(2)-based file locking now correctly passes the
  struct pointer.
  (DER - 2019/08/27, HDFFV-10892)

- Fixed a bug caused by a bad tag value when condensing object header
  messages

  There was an assertion failure when moving messages while running a
  user test program with library release HDF5 1.10.4. It occurred
  because the tag value (the object header's address) was not set up
  when entering the library routine H5O__chunk_update_idx(), which
  eventually verifies the metadata tag value when protecting the
  object header. The problem was fixed by replacing FUNC_ENTER_PACKAGE
  in H5O__chunk_update_idx() with
  FUNC_ENTER_PACKAGE_TAG(oh->cache_info.addr) to set up the metadata
  tag.
  (VC - 2019/08/23, HDFFV-10873)

- Fixed the test failure from test_metadata_read_retry_info() in
  test/swmr.c

  The test failure was due to an incorrect number of bins returned for
  retry info (info.nbins). The number of bins expected for 101 read
  attempts is 3 instead of 2. The routine H5F_set_retries() in
  src/H5Fint.c calculates the number of bins by first obtaining the
  log10 value for (read attempts - 1). For PGI/19, the log10 value for
  100 read attempts is 1.9999999999999998 instead of 2.00000. When
  casting the log10 value to unsigned later on, the decimal part is
  chopped off, causing the test failure. This was fixed by obtaining
  the rounded integer value (HDceil) of the log10 value of read
  attempts before casting the result to unsigned.
  (VC - 2019/8/14, HDFFV-10813)

- Fixed an issue when creating a file with non-default file space info
  together with the library high bound set to H5F_LIBVER_V18

  When setting non-default file space info in the fcpl via
  H5Pset_file_space_strategy() and then creating a file with both high
  and low library bounds set to H5F_LIBVER_V18 in the fapl, the
  library succeeded in creating the file. File creation should fail
  because the feature of setting non-default file space info does not
  exist in library release 1.8 or earlier. This was fixed by setting
  and checking the proper version in the file space info message,
  based on the library low and high bounds, when creating and opening
  the HDF5 file.
  (VC - 2019/6/25, HDFFV-10808)

- Fixed an issue where copying a version 1.8 dataset between files
  using H5Ocopy fails due to an incompatible fill version

  When using the HDF5 1.10.x H5Ocopy() API call to copy a version 1.8
  dataset to a file created with both high and low library bounds set
  to H5F_LIBVER_V18, the H5Ocopy() call would fail with the error
  stack indicating that the fill value version is out of bounds. This
  was fixed by changing the fill value message version to
  H5O_FILL_VERSION_3 (from H5O_FILL_VERSION_2) for H5F_LIBVER_V18.
  (VC - 2019/6/14, HDFFV-10800)

- Fixed a bug that would cause an error, or cause fill values to be
  incorrectly read, from a chunked dataset using the "single chunk"
  index if the data was held in cache and there was no data on disk.
  (NAF - 2019/03/06)

- Fixed a bug that could cause an error, or cause fill values to be
  incorrectly read, from a dataset that was written to using
  H5Dwrite_chunk if the dataset was not closed after writing.
  (NAF - 2019/03/06, HDFFV-10716)

- Fixed a memory leak in the scale-offset filter

  In a special case where MinBits is the same as the number of bits in
  the datatype's precision, the filter's data buffer was not freed,
  causing the memory usage to grow. In general the buffer was freed
  correctly. MinBits is the minimal number of bits needed to store the
  data values. Please see the reference manual for H5Pset_scaleoffset
  for the details.
  (RL - 2019/3/4, HDFFV-10705)

Configuration
-------------
- Correct option for default API version

  CMake options for the default API version are not mutually
  exclusive. Changed the multiple BOOL options to a single STRING
  option with the strings v16, v18, v110.
  (ADB - 2019/08/12, HDFFV-10879)

Tools
-----
- h5repack was fixed to repack datasets with external storage to other
  types of storage. A new test was added to repack files and verify
  the correct data using h5diff.
  (JS - 2019/09/25, HDFFV-10408)
  (ADB - 2019/10/02, HDFFV-10918)
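The new build switches listed under Configuration above can be exercised from either build system. A minimal sketch, assuming an extracted HDF5 1.10.6 source tree (paths and the out-of-tree build directory are illustrative):

```shell
# Autotools: skip tools and tests to cut build time
# (per the notes above, roughly 45% when both are disabled).
./configure --disable-tools --disable-tests

# CMake: the new equivalents of the autotools switches,
# plus the new shared-only option, in an out-of-tree build dir.
cmake -DHDF5_ENABLE_ASSERTS=ON \
      -DHDF5_ENABLE_SYMBOLS=ON \
      -DHDF5_ENABLE_PROFILING=ON \
      -DHDF5_ENABLE_OPTIMIZATION=ON \
      -DONLY_SHARED_LIBS=ON \
      ..
```

Note that, as stated above, re-enabling tools or tests after the initial build requires re-running configure with the option(s) enabled.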
2020-01-08 13:30:34 +01:00
# $NetBSD: Makefile,v 1.53 2020/01/08 12:30:34 nia Exp $
Changes 1.6.3:

New Features

Configuration:
- Added some initial support for making purify (or similar memory
  checking products) happier by initializing buffers to zero and
  disabling the internal free list code. To take advantage of this,
  define 'H5_USING_PURIFY' in your CFLAGS when building the library.
- Windows building, testing and installing improvements:
  - On Windows, FORTRAN, C++ and C projects are merged into one zip
    file; users can choose an option to build FORTRAN or C++ or both
    with the basic C library. For detailed information, please read
    INSTALL_Windows.txt.
  - On Windows, the szip compression library, with or without the
    encoder, can easily be turned off or on when building HDF5. For
    detailed information, please read INSTALL_Windows.txt, especially
    section V.
  - On Windows, an optional procedure for building, testing and
    installing HDF5 from the command line is provided. This procedure
    is intended to be convenient for experienced users; please read
    INSTALL_windows_From_Command_Line.txt for details.
  - On Windows, an alternative short instruction document for
    building, testing and installing HDF5 is provided. This
    instruction is intended to be convenient for general users; please
    read INSTALL_Windows_Short.txt for details.
  - On Windows, h5repack, h5diff, h5ls and h5import tool tests have
    been added.

Library:
- Modified how HDF5 calculates the 'pixels_per_scanline' parameter for
  SZIP compression. There is now no restriction on the size and shape
  of the chunk, except that the total number of elements in the chunk
  cannot be bigger than the 'pixels_per_block' parameter provided by
  the user.
- HDF5 can now link to SZIP with or without szip's encoder. The new
  API function H5Zget_filter_info can be used to check szip's status.
  Attempting to assign szip to a dataset property list or attempting
  to write with szip will generate an error if szip's encoder is
  disabled. JL/NF - 2004/6/30
- SZIP always uses K13 compression. This flag no longer needs to be
  set when calling H5Pset_szip. If the flag for CHIP compression is
  set, it will be ignored (since the two are mutually exclusive).
  JL/NF - 2004/6/30
- A new API function, H5Fget_name, was added. It returns the name of
  the file by object (file, group, dataset, named datatype, attribute)
  ID. SLU - 2004/06/29
- A new API function, H5Fget_filesize, was added. It returns the
  actual file size of the opened file. SLU - 2004/06/24
- Added an option so that if $HDF5_DISABLE_VERSION_CHECK is set to 2,
  all library version mismatch warning messages are suppressed.

Tools:
- h5repack was added to the tools suite. h5repack regenerates an HDF5
  file from another HDF5 file, optionally applying HDF5 filters
  (compression) and/or chunking to the copied file. The filter options
  are read from the command line. See /doc/html/Tools.html for more
  details. PVN - 2004/9/13
- h5dump includes new features:
  1) Printing of dataset filters, storage layout and fill value
     information.
  2) Print a list of the file contents.
  3) Escape non-printing characters.
  4) Print the content of the boot block.
  5) Print array indices with the data (the default).
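The purify support and version-check override mentioned above are both driven by build flags and environment variables. A minimal sketch, assuming an extracted HDF5 1.6.3 source tree (the prefix and the application name are illustrative):

```shell
# Build with buffers zero-initialized and the internal free lists
# disabled, as described above, so memory checkers such as purify
# report fewer false positives.
CFLAGS="-g -DH5_USING_PURIFY" ./configure --prefix=/opt/hdf5-1.6.3
make

# Suppress all library/header version-mismatch warnings at run time
# (my_hdf5_app is a hypothetical application linked against HDF5).
HDF5_DISABLE_VERSION_CHECK=2 ./my_hdf5_app
```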
2005-02-25 10:10:18 +01:00
COMMENT=	Hierarchical Data Format (new generation)

INSTALLATION_DIRS=	bin lib include
INSTALLATION_DIRS+=	share/doc/hdf5
INSTALLATION_DIRS+=	share/examples/hdf5/c
post-install:
	cd ${WRKSRC} && ${INSTALL_DATA} COPYING \
		release_docs/RELEASE.txt \
		release_docs/HISTORY-1_10.txt \
		release_docs/HISTORY-1_8_0-1_10_0.txt \
		release_docs/HISTORY-1_8.txt \
		release_docs/HISTORY-1_0-1_8_0_rc3.txt \
		${DESTDIR}${PREFIX}/share/doc/hdf5/

.include "Makefile.common"
.include "options.mk"
.include "../../devel/zlib/buildlink3.mk"
.include "../../mk/bsd.pkg.mk"