Installing LUMA
As LUMA is written in C++, you need to make sure you have a C++ compiler and implementation installed. The compiler must support the C++11 standard. We also assume you are building LUMA as a 64-bit application and hence have a 64-bit machine. It is possible to target 32-bit architectures with LUMA, but you must make sure that all the dependencies are also built to target a 32-bit machine to avoid run-time incompatibilities.
For Windows users, we recommend you install Visual Studio and also CMake to help build LUMA's dependencies. CMake must be run with administrator privileges. At the time of writing, LUMA has been built and tested using Visual Studio 2015. Dependencies have been configured for VS2015 using CMake v3.9.1. Visual Studio and CMake can be obtained from their respective websites.
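As an illustration, a dependency would typically be configured for a 64-bit VS2015 build by selecting the matching CMake generator and then building the generated solution. The source path below is a placeholder, and the commands assume they are run from an empty build directory in an elevated command prompt:
cmake -G "Visual Studio 14 2015 Win64" C:\path\to\dependency\source
cmake --build . --config Release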
To utilise the power of Visual Studio as an IDE you should create a new solution and project and then add the inc and src directories of the LUMA repository to the project. LUMA has only been built and tested in 64-bit so we suggest you go to the Configuration Manager and remove the Win32 (x86) build configurations to avoid later confusion. Finally, as LUMA uses some standard library functions that Microsoft considers "unsafe", you will need to add the _CRT_SECURE_NO_WARNINGS pre-processor definition under Properties->C/C++->Preprocessor->Preprocessor Definitions to avoid compile-time errors.
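For example, the Preprocessor Definitions field would then contain something like the following (the %(PreprocessorDefinitions) entry simply keeps any inherited definitions):
_CRT_SECURE_NO_WARNINGS;%(PreprocessorDefinitions)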
LUMA and its dependencies require the GNU C/C++ compiler to be installed. LUMA has been built successfully using GCC 4.9.0 on Ubuntu 14.04. Makefiles are supplied with LUMA to allow the use of the GNU Make utility for build management.
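A quick, non-essential way to confirm that your installed GCC supports C++11 is to query its version and compile a trivial test file (the file name here is arbitrary):
g++ --version
echo 'int main() { auto x = 0; return x; }' > cxx11_check.cpp
g++ -std=c++11 cxx11_check.cpp -o cxx11_check && ./cxx11_check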
Before building LUMA you first need to build the dependencies following this guide. If available, you could just install pre-built binaries, but to guarantee compatibility we suggest building everything from source.
Once you have the dependencies built, you will need to make a note of the location of the relevant header files and library files. These paths will be required to allow LUMA to include relevant headers and link to the dependencies you have just built.
Assuming you have set up a suitable Visual Studio project for LUMA and added the header and source files as described above you now need to add the paths to the header files for the dependencies as well as the paths to the libraries and indicate which libraries VS needs to load.
Under Project->Properties, add the following to the Additional Include Directories value on the C/C++->General tab:
$(HDF5_INC);
$(MSMPI_INC);
where $(HDF5_INC) and $(MSMPI_INC) are system variables set through the Windows Control Panel which point to your HDF5 and MPI header files respectively.
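These variables can also be set from a command prompt with setx (new values are only picked up by processes started afterwards, so restart Visual Studio after setting them). The HDF5 path below is a placeholder and the MS-MPI path is the SDK's default install location, which the MS-MPI SDK installer normally sets up for you anyway; adjust both to your own installation:
setx HDF5_INC "C:\path\to\hdf5\include"
setx MSMPI_INC "C:\Program Files (x86)\Microsoft SDKs\MPI\Include"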
Under Linker->General, add the following to the Additional Library Directories value:
$(MSMPI_LIB64);
$(HDF5_LIB);
$(LAPACK_LIB);
where these are again system variables that point to the built binaries for MPI, HDF5 and LAPACK. Note that for many of these, the name of the environment variable is of your own choosing and may differ from what is shown here. If in doubt, you can simply replace these variable references with the full paths.
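For example, written out with explicit paths instead of variable references, the Additional Library Directories value might look like the following (the HDF5 and LAPACK paths are placeholders and the MS-MPI path is the SDK's default location):
C:\Program Files (x86)\Microsoft SDKs\MPI\Lib\x64;C:\path\to\hdf5\lib;C:\path\to\lapack\lib;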
Finally, add the following to the Additional Dependencies value on the Linker->Input tab:
libszip.lib;
libzlib.lib;
libhdf5.lib;
msmpi.lib;
liblapack.lib;
to allow linking against the HDF5, MPI and LAPACK libraries.
Note: Debug builds for some dependencies append a _D to the end of the library name. You may need to set up Release and Debug build configurations separately in Visual Studio using the appropriately named libraries.
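For instance, if your HDF5 debug binaries (and the szip and zlib libraries built alongside them) carry the _D suffix, the Debug configuration's Additional Dependencies list might read as follows; check the file names actually produced by your dependency builds, as the suffix convention varies between libraries:
libszip_D.lib;
libzlib_D.lib;
libhdf5_D.lib;
msmpi.lib;
liblapack.lib;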
Note 2: If you did not build HDF5 with the szip and zlib modules set to build as static libraries then you may not have libszip.lib and libzlib.lib to link to. In this case your build will complain at link-time that it cannot find them. Simply excluding these from the linking list should correct this.
The following steps will guide you through how to compile and run LUMA using the GCC compiler, assuming you have already built the dependencies using this guide. While it isn't essential to set the HDF5_HOME environment variable, it makes running LUMA more convenient.
First find the path to where the HDF5 library is installed - for us this is /usr/lib/x86_64-linux-gnu/hdf5/mpich - and set the HDF5_HOME environment variable:
export HDF5_HOME=/usr/lib/x86_64-linux-gnu/hdf5/mpich
(This can be added to your bash .profile to save you having to do this every time you open a terminal.)
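For example, to make the setting persistent using the path above (substitute your own HDF5 location if it differs):
echo 'export HDF5_HOME=/usr/lib/x86_64-linux-gnu/hdf5/mpich' >> ~/.profile
source ~/.profile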
A makefile is provided in the build directory. Copy this makefile up one level in the directory structure and use the command make to run it. Hopefully it will build LUMA successfully and you should have an executable in the root directory called LUMA. To run LUMA enter the command:
mpirun -np NPROCS ./LUMA
where NPROCS is the number of processes you want to run LUMA with (this value must match the number of processes set in the definition file before compiling LUMA). If building in serial, LUMA still needs to link against the HDF5 libraries to compile, but it can be run without an MPI environment as ./LUMA.
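Putting these steps together, a typical build-and-run session from the repository root might look like the following; the makefile name and the process count of 4 are assumptions, so adjust them to match your repository and your definition file:
cp build/makefile .
make
mpirun -np 4 ./LUMA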
You may get the error:
Undefined reference to H5Pset_dxpl_mpio
or:
Undefined reference to H5Pset_fapl_mpio
This is because your HDF5 installation was not built with the parallel options enabled. Please see building the dependencies for more information. If you cannot find a binary that was built with this flag, you may have to build from source.
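One way to check whether an HDF5 installation has parallel support is to look for the H5_HAVE_PARALLEL macro in its H5pubconf.h header; the path below matches the Ubuntu MPICH packages used in the example above and will differ on other systems:
grep H5_HAVE_PARALLEL /usr/include/hdf5/mpich/H5pubconf.h
If the macro is defined, the library was built with parallel I/O enabled.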
You may also find that parallel builds fail at run time with errors due to a conflict between MPICH and Open MPI. Please see this issue for more details.
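To see which MPI implementation your mpirun actually belongs to (and so whether it matches the one your HDF5 build was compiled against), you can for example check:
which mpirun
mpirun --version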
The h5mgm merge tool is a post-processor that takes the multi-grid HDF5 output from LUMA and converts this data into a time-series of VTU mesh files readable by ParaView. A guide to building the merge tool is available here.
In order to verify LUMA is installed correctly, download, build and run one of the test cases.