Tested on Ubuntu 22.04.4 LTS with Julia v1.10.2
Here we show how to automatically compile ExaFMM with the build.sh script.
Since we will need to compile the C++ part of ExaFMM, first clone FLOWExaFMM somewhere on your machine:

```bash
git clone https://github.com/byuflowlab/FLOWExaFMM
```
Run the script build.sh that is under FLOWExaFMM:

```bash
cd path/to/your/FLOWExaFMM
sh build.sh
```
This should have generated the file fmm.so under src/, which is a binary library containing ExaFMM.
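As a quick, optional sanity check (a sketch assuming you are still at the root of FLOWExaFMM and that ldd is available, as on most Linux systems), confirm that the library exists and that its dynamic dependencies resolve:

```bash
# Confirm the compiled library exists
ls -lh src/fmm.so

# Report any shared-library dependencies that failed to resolve
ldd src/fmm.so | grep -i "not found" || echo "all shared dependencies resolved"
```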
If build.sh fails to automatically compile ExaFMM, the following steps will help you debug the source of the error and compile the code manually.
These instructions were tested on Ubuntu 22.04 LTS with Julia v1.8.5 on a Dell 7760 laptop.
## CxxWrap setup

First, we will test that CxxWrap runs correctly on your machine. This package is a Julia wrapper for C++ code. Start by adding CxxWrap to Julia:

```julia
julia> ] add CxxWrap
```
You might get an error complaining that you don't have CMake installed on your system. If so, go ahead and get that set up. On a Linux machine it's done with

```bash
sudo apt-get install cmake
```
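You can confirm the installation took (a trivial optional check; any reasonably recent CMake should do):

```bash
# CxxWrap's build step relies on CMake; confirm it is on the PATH
cmake --version | head -n 1
```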
After installing CMake, make sure the CxxWrap package builds and passes its tests:

```julia
julia> ] test CxxWrap
```
## Hello World in CxxWrap

Test that CxxWrap is working properly as follows. First, create a file named hello.cpp with the following C++ code:
```cxx
#include <string>
#include "jlcxx/jlcxx.hpp"

// Test function
std::string greet()
{
    return "hello, world";
}

// Exposing the function to Julia
JLCXX_MODULE define_julia_module(jlcxx::Module& mod)
{
    mod.method("greet", &greet);
}
```
In order to compile the code, we need to point the compiler to wherever the CxxWrap include files are. Most likely, they are under the path returned by

```julia
import CxxWrap
CxxWrap.prefix_path()
```
In my case, this is what I get:

```julia
julia> import CxxWrap

julia> CxxWrap.prefix_path()
"/home/edoalvar/.julia/artifacts/5209ca23f516fb3391b885eef717e49b4ee0a268"
```
You will also have to find out where the Julia include files are. This can be done with the following command:

```julia
julia> abspath(Sys.BINDIR, Base.INCLUDEDIR, "julia")
"/home/edoalvar/.julia/juliaup/julia-1.10.2+0.x64.linux.gnu/include/julia"
```
Then, we generate the C++ object (called hello.cpp.o) with the following command:

```bash
JLCXX_H=/home/edoalvar/.julia/artifacts/5209ca23f516fb3391b885eef717e49b4ee0a268/include
JULIA_H=/home/edoalvar/.julia/juliaup/julia-1.10.2+0.x64.linux.gnu/include/julia

# Compile object hello.cpp.o
c++ -DJULIA_ENABLE_THREADING -Dhello_EXPORTS -I$JLCXX_H -I$JULIA_H \
    -march=native -Wunused-parameter -Wextra -Wreorder -std=gnu++1z -O3 -DNDEBUG -fPIC \
    -o hello.cpp.o -c hello.cpp
```
NOTE: Make sure you are using an updated version of gcc (gcc --version must show 7.3 or newer).
NOTE 2: JLCXX_H and JULIA_H can be automatically defined in the command line as follows:

```bash
JLCXX_H=$(julia --print "import CxxWrap; CxxWrap.prefix_path()")
JLCXX_H=${JLCXX_H%\"}; JLCXX_H=${JLCXX_H#\"}; JLCXX_H=$JLCXX_H/include

JULIA_H=$(julia --print "abspath(Sys.BINDIR, Base.INCLUDEDIR)")
JULIA_H=${JULIA_H%\"}; JULIA_H=${JULIA_H#\"}; JULIA_H=$JULIA_H/julia
```
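Before linking, you can verify that the object file was actually produced (an optional check; `file` ships with most Linux distributions):

```bash
# A successful compile leaves a relocatable ELF object in the working directory
file hello.cpp.o
```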
In order to convert the object into a shared library, we will have to point the compiler to where both libcxxwrap_julia.so and libjulia.so are. We then generate the shared library libhello.so through the following command:
```bash
JLCXX_LIB=${JLCXX_H}/../lib/
JULIA_LIB=${JULIA_H}/../../lib/

# Create shared library libhello.so
c++ -fPIC -march=native -Wunused-parameter -Wextra -Wreorder -std=gnu++1z -O3 -DNDEBUG \
    -shared -Wl,-soname,libhello.so -o libhello.so hello.cpp.o \
    -Wl,-rpath,: -L${JLCXX_LIB} -lcxxwrap_julia -L${JULIA_LIB} -ljulia
```
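At this point you can check that the library was produced and that the linker recorded its dependencies (a sketch using standard Linux tooling; the entries may read "not found" here, since the runtime path is only resolved once Julia loads the library):

```bash
# libhello.so should list libcxxwrap_julia and libjulia among its dependencies
ldd libhello.so | grep -E "cxxwrap|julia"
```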
In summary:

- JULIA_H must point to the directory that contains julia.h.
- JLCXX_H must point to the directory that contains jlcxx/jlcxx.hpp.
- JULIA_LIB must point to the directory that contains libjulia.so.1 (or whatever version of libjulia you have there).
- JLCXX_LIB must point to the directory that contains libcxxwrap_julia.so.0.12.2 (or whatever version of libcxxwrap you found).
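These four assumptions are easy to verify directly from the shell (a minimal sketch; it only checks that the expected files exist under each path):

```bash
# Each test prints "ok" when the corresponding variable is set correctly
test -f "$JULIA_H/julia.h"          && echo "JULIA_H ok"
test -f "$JLCXX_H/jlcxx/jlcxx.hpp"  && echo "JLCXX_H ok"
ls "$JULIA_LIB"/libjulia.so*         >/dev/null 2>&1 && echo "JULIA_LIB ok"
ls "$JLCXX_LIB"/libcxxwrap_julia.so* >/dev/null 2>&1 && echo "JLCXX_LIB ok"
```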
Now, we test that the C++ code was successfully compiled by importing the libhello library into Julia and calling its greet() function. Open the Julia REPL and paste the following lines:
```julia
# Load the module and generate the functions
module CppHello
    using CxxWrap

    @wrapmodule( () -> "./libhello" )

    function __init__()
        @initcxx
    end
end

# Call greet and show the result
@show CppHello.greet()
```
This should return a heart-warming hello world: the REPL prints `CppHello.greet() = "hello, world"`.
We now compile the C++ part of ExaFMM manually. If you haven't already, clone FLOWExaFMM somewhere on your machine:

```bash
git clone https://github.com/byuflowlab/FLOWExaFMM
```
Before compiling ExaFMM, make sure you have an MPI library on your system for parallel processing. On Ubuntu you can install the development tools of OpenMPI with the following command:

```bash
sudo apt-get install openmpi-bin openmpi-common libopenmpi-dev
```
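To confirm MPI is ready to use (an optional check; both commands ship with OpenMPI):

```bash
# The compiler wrapper and the launcher should both report a version
mpicxx --version
mpirun --version
```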
Now, go to wherever you cloned FLOWExaFMM. In order to compile FLOWExaFMM wrapped with libcxxwrap, first we need to find the variables JULIA_H, JLCXX_H, JULIA_LIB, and JLCXX_LIB inside the build script build.sh, and point them to the paths that we determined in the "Hello, World" example (see previous section).
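For instance, using the paths found earlier in this guide, the variables might end up looking like this (illustrative values only; substitute the paths on your own machine, and note that the exact layout of build.sh may differ):

```bash
# Example values for the four variables inside build.sh
JULIA_H=/home/edoalvar/.julia/juliaup/julia-1.10.2+0.x64.linux.gnu/include/julia
JLCXX_H=/home/edoalvar/.julia/artifacts/5209ca23f516fb3391b885eef717e49b4ee0a268/include
JULIA_LIB=$JULIA_H/../../lib
JLCXX_LIB=$JLCXX_H/../lib
```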
Now go to the root level of the FLOWExaFMM folder and run the command sh build.sh. If everything went well, this script will compile and generate a shared library fmm.so under src/ in the FLOWExaFMM package.
Now that ExaFMM is compiled, you can add FLOWExaFMM to your Julia environment as a development package pointing directly to wherever you compiled the package:
```julia
julia> ] develop path/to/your/flowexafmm/FLOWExaFMM
```
You can add FLOWVPM to Julia directly from the repo:
```julia
julia> ] add https://github.com/byuflowlab/FLOWVPM.jl
```
As a sanity check, verify that FLOWExaFMM and FLOWVPM run correctly by running their unit tests:

```julia
julia> ] test FLOWExaFMM
julia> ] test FLOWVPM
```
Below are some common problems that may come up when compiling and running FLOWVPM on BYU's FSL supercomputer.
## libmpi.so.40: cannot open shared object file

When importing FLOWExaFMM, I ran into the following problem:

```julia
julia> import FLOWExaFMM
ERROR: InitError: could not load library "/fslhome/edoalvar/Codes/FLOWExaFMM/src/fmm"
libmpi.so.40: cannot open shared object file: No such file or directory
```
It seems like the login node doesn't expose the lib folder of OpenMPI at the system level, so we have to bundle that path into the shared library manually. On FSL, the folder is found under /apps/openmpi/4.1.1/gcc-10.2.0_cuda-11.2.1/lib. I ended up taking the last command that is run by make, added that path after the -rpath flag, and re-ran the command manually. This looks as follows:
```bash
cd build/3d
rm -f fmm; rm ../../src/fmm.so

mpicxx -ffast-math -funroll-loops -fabi-version=6 -Wfatal-errors -fopenmp -g -O2 -o fmm fmm-fmm.o -L/fslhome/edoalvar/.julia/artifacts/16e1de4679fb8520a8af4e6831c7c8e9893d18b4/include/../lib -lcxxwrap_julia -fPIC -march=native -Wunused-parameter -Wextra -Wreorder -std=gnu++1z -O3 -DNDEBUG -shared -Wl,-rpath,/apps/openmpi/4.1.1/gcc-10.2.0_cuda-11.2.1/lib: -L/fslhome/edoalvar/.julia/artifacts/16e1de4679fb8520a8af4e6831c7c8e9893d18b4/include/../lib -lcxxwrap_julia -L/apps/julia/1.6.1/gcc-10.2.0/include/julia/../../lib -ljulia

cp fmm ../../src/fmm.so
```
That should do the trick.
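To confirm the OpenMPI path actually got embedded, you can inspect the library's dynamic section (an optional check, assuming you are still inside build/3d; readelf is part of binutils):

```bash
# The RPATH/RUNPATH entry should now include the OpenMPI lib folder
readelf -d ../../src/fmm.so | grep -iE "rpath|runpath"
```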
When using FLOWExaFMM in a node for batch work, the code needs to have been compiled inside the node, so you probably want to recompile the code when you launch each batch job. Alternatively, DG suggested using the flag -march=broadwell when compiling for the m9 nodes; this way you only need to compile it once on the login node:
```bash
cd build/3d
rm -f fmm; rm ../../src/fmm.so

mpicxx -ffast-math -funroll-loops -fabi-version=6 -Wfatal-errors -fopenmp -g -O2 -o fmm fmm-fmm.o -L/fslhome/edoalvar/.julia/artifacts/16e1de4679fb8520a8af4e6831c7c8e9893d18b4/include/../lib -lcxxwrap_julia -fPIC -march=broadwell -Wunused-parameter -Wextra -Wreorder -std=gnu++1z -O3 -DNDEBUG -shared -Wl,-rpath,/apps/openmpi/4.1.1/gcc-10.2.0_cuda-11.2.1/lib: -L/fslhome/edoalvar/.julia/artifacts/16e1de4679fb8520a8af4e6831c7c8e9893d18b4/include/../lib -lcxxwrap_julia -L/apps/julia/1.6.1/gcc-10.2.0/include/julia/../../lib -ljulia

cp fmm ../../src/fmm.so
```
If useful, here is the compilation instruction that worked for me:

```bash
mpicxx -DHAVE_CONFIG_H -DJULIA_ENABLE_THREADING -Dhello_EXPORTS -I/fslhome/edoalvar/.julia/artifacts/16e1de4679fb8520a8af4e6831c7c8e9893d18b4/include -I/apps/julia/1.6.1/gcc-10.2.0/include/julia -march=broadwell -Wunused-parameter -Wextra -Wreorder -std=gnu++1z -O3 -DNDEBUG -fPIC -I. -I.. -DEXAFMM_WITH_OPENMP -msse3 -mavx -mavx2 -DNDEBUG -DEXAFMM_EAGER -ffast-math -funroll-loops -fabi-version=6 -Wfatal-errors -fopenmp -g -O2 -MT fmm-fmm.o -MD -MP -MF .deps/fmm-fmm.Tpo -c -o fmm-fmm.o `test -f 'fmm.cxx' || echo './'`fmm.cxx

mpicxx -ffast-math -funroll-loops -fabi-version=6 -Wfatal-errors -fopenmp -g -O2 -o fmm fmm-fmm.o -L/fslhome/edoalvar/.julia/artifacts/16e1de4679fb8520a8af4e6831c7c8e9893d18b4/include/../lib -lcxxwrap_julia -fPIC -march=native -Wunused-parameter -Wextra -Wreorder -std=gnu++1z -O3 -DNDEBUG -shared -Wl,-rpath,: -L/fslhome/edoalvar/.julia/artifacts/16e1de4679fb8520a8af4e6831c7c8e9893d18b4/include/../lib -lcxxwrap_julia -L/apps/julia/1.6.1/gcc-10.2.0/include/julia/../../lib -ljulia
```
A first stab at generating binary files with BinaryBuilder.
Repository: https://github.com/byuflowlab/FLOWExaFMM, commit 43c5eecf454f73b828e2536702f8f7d3c6c5889e
- Dependencies: libcxxwrap_julia, libjulia, OpenMPI, MPICH_jll, MicrosoftMPI_jll, LLVMOpenMP_jll
- FLOWExaFMM version: 2.1.0
- Compilers: GCC v10.2.0, LLVM v12.0.0
Build script:

```bash
cd FLOWExaFMM

JULIA_H=${WORKSPACE}/destdir/include/julia
JLCXX_H=${WORKSPACE}/destdir/include/jlcxx
JULIA_LIB=${WORKSPACE}/destdir/lib
JLCXX_LIB=${WORKSPACE}/destdir/lib

cp -r deps build && cd build
./configure --prefix=${prefix} --build=${MACHTYPE} --host=${target}

cd 3d
make JULIA_H=$JULIA_H JLCXX_H=$JLCXX_H JULIA_LIB=$JULIA_LIB JLCXX_LIB=$JLCXX_LIB
make install
```
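For context, a recipe like this is normally driven from a build_tarballs.jl file and run with BinaryBuilder.jl. A typical invocation looks like the following (a sketch; the file name and platform triplet here are assumptions, not the actual recipe for this repo):

```bash
# Build for a single Linux target with verbose output
julia build_tarballs.jl --verbose x86_64-linux-gnu
```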
TO KEEP IN MIND
- Use libcxxwrap_julia-v0.8.3+0 instead of libcxxwrap_julia_jll v0.9.1
- Use libjulia-v1.6.0+0 instead of libjulia_jll v1.8.0+2
```
Warning: /tmp/jl_Ju0DSQ/KDG9HsBs/x86_64-linux-gnu-libgfortran5-cxx11/destdir/bin/fmm contains std::string values! This causes incompatibilities across the GCC 4/5 version boundary. To remedy this, you must build a tarball for both GCC 4 and GCC 5. To do this, immediately after your platforms definition in your build_tarballs.jl file, add the line:
│
│     platforms = expand_cxxstring_abis(platforms)
└ @ BinaryBuilder.Auditor ~/.julia/packages/BinaryBuilder/CKu9k/src/auditor/compiler_abi.jl:247
```
* Is it possible that the different MPI binaries are conflicting with each other?
```bash
mpicxx -DHAVE_CONFIG_H -DJULIA_ENABLE_THREADING -Dhello_EXPORTS -I/workspace/destdir/include/jlcxx -I/workspace/destdir/include/julia -Wunused-parameter -Wextra -Wreorder -std=gnu++1z -O3 -DNDEBUG -fPIC -I. -I.. -DEXAFMM_WITH_OPENMP -DNDEBUG -DEXAFMM_EAGER -ffast-math -funroll-loops -fabi-version=6 -Wfatal-errors -fopenmp -g -O2 -MT fmm-fmm.o -MD -MP -MF .deps/fmm-fmm.Tpo -c -o fmm-fmm.o `test -f 'fmm.cxx' || echo './'`fmm.cxx
```