intel-llvm-sycl/dpcpp compiler linking error: soft-float modules with double-float modules #17726
I've tried the advice outlined in this thread: https://community.intel.com/t5/Intel-Fortran-Compiler/Double-precison-on-ARC-GPU/td-p/1450217 with the understanding that those environment flags are for the Fortran compiler. The following compilation command:
yields
Are there comparable environment variables for the Intel SYCL compiler?
Wanted to note this reference link: https://github.com/intel/compute-runtime/blob/master/opencl/doc/FAQ.md#feature-double-precision-emulation-fp64
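The compute-runtime FAQ linked above documents two environment variables that enable FP64 software emulation on GPUs without native double-precision support. A minimal sketch, assuming the variable names from that FAQ apply to the driver version in use here:

```shell
# Enable FP64 software emulation in the Intel compute runtime / IGC
# (variable names as documented in the linked compute-runtime FAQ;
# whether they apply to this driver version is an assumption).
export OverrideDefaultFP64Settings=1
export IGC_EnableDPEmulation=1

# Then run the SYCL binary as usual, e.g.:
# ./hello_dg12
echo "$OverrideDefaultFP64Settings $IGC_EnableDPEmulation"
```

Note these variables act at runtime on the GPU driver stack, so they would not by themselves affect a host-side link error.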
Hi @ct-clmsn, thanks for the report. I don't think that this is in any way related to
Whilst we are trying to reproduce this on our end, could you please repeat the failing command with
Could you please also be a bit more specific about your environment: what specific OS and CPU are you running on?
I'm currently running on RISC-V with Ubuntu 24.04.1 LTS; I've been working on porting the environment. |
compilation command with
output: clang version 19.0.0git
Describe the bug
I've compiled the intel-llvm-sycl/dpcpp compiler from GitHub here specifically to use the compiler with an Intel GPU: https://github.com/intel/llvm
Right now, I'd like to compile a simple "hello world" program that prints out the name of the Intel Arc A750 GPU that I've installed on the machine.
I've compiled this program successfully using the Adaptive C++ compiler from here: https://adaptivecpp.github.io/
When using the Intel-LLVM-Sycl/dpcpp compiler with the following flags:
clang++ -DONEDPL_USE_DPCPP_BACKEND=1 -DONEDPL_USE_OPENMP_BACKEND=0 -DONEDPL_USE_TBB_BACKEND=0 -DPSTL_USE_PARALLEL_POLICIES=0 -D_GLIBCXX_USE_TBB_PAR_BACKEND=0 -O3 -DNDEBUG -std=gnu++20 -fsycl -fsycl-unnamed-lambda -fsycl-targets=spir64 -Xsycl-target-backend "-device intel_gpu_dg2_g12" hellosycl.cpp -o hello_dg12
clang++ -DONEDPL_USE_DPCPP_BACKEND=1 -DONEDPL_USE_OPENMP_BACKEND=0 -DONEDPL_USE_TBB_BACKEND=0 -DPSTL_USE_PARALLEL_POLICIES=0 -D_GLIBCXX_USE_TBB_PAR_BACKEND=0 -O3 -DNDEBUG -std=gnu++20 -fsycl -fsycl-unnamed-lambda -fsycl-targets=spir64 hellosycl.cpp -o hello_dg12
clang++ -DONEDPL_USE_DPCPP_BACKEND=1 -DONEDPL_USE_OPENMP_BACKEND=0 -DONEDPL_USE_TBB_BACKEND=0 -DPSTL_USE_PARALLEL_POLICIES=0 -D_GLIBCXX_USE_TBB_PAR_BACKEND=0 -O3 -DNDEBUG -std=gnu++20 -fsycl -fsycl-targets=spir64 hellosycl.cpp -o hello_dg12
The compilation process terminates with this linking error:
/usr/bin/ld: /tmp/hello_dg12-wrapper-f79096.o: can't link soft-float modules with double-float modules
Any assistance would be much appreciated! I get the impression there's a missing flag that tells the compiler to emulate 64-bit floating point (double precision) on this GPU architecture.
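For what it's worth, on RISC-V the ld diagnostic "can't link soft-float modules with double-float modules" usually indicates a float-ABI mismatch between host object files (soft-float lp64 vs double-float lp64d), not a GPU double-precision issue. A hypothetical sketch of forcing the double-float ABI on the host side, where the rv64gc/lp64d values are assumptions about this target and not taken from the issue:

```shell
# Hypothetical: force the RISC-V double-float ABI for host objects so they
# match the float ABI of the system toolchain and libraries.
# (rv64gc / lp64d are assumed target values for this machine.)
ABI_FLAGS="-march=rv64gc -mabi=lp64d"

# Example invocation (not run here; requires the DPC++ toolchain):
# clang++ $ABI_FLAGS -fsycl -fsycl-targets=spir64 hellosycl.cpp -o hello_dg12
echo "$ABI_FLAGS"
```

If the integration-header wrapper object (`hello_dg12-wrapper-*.o` in the error) is being built with a different default ABI than the rest of the link, pinning the ABI explicitly is one way to make them agree.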
I also wanted to note that the
-fsycl-targets=spir64_gen
flag causes the compiler to fail and terminate with: clang++: error: gen compiler command failed with exit code 1 (use -v to see invocation)
To reproduce
#include <iostream>
#include <sycl/sycl.hpp>

int main(int argc, char* argv[]) {
    sycl::queue q;
    std::cout << "Running on "
              << q.get_device().get_info<sycl::info::device::name>()
              << "\n";
}
clang++ -DONEDPL_USE_DPCPP_BACKEND=1 -DONEDPL_USE_OPENMP_BACKEND=0 -DONEDPL_USE_TBB_BACKEND=0 -DPSTL_USE_PARALLEL_POLICIES=0 -D_GLIBCXX_USE_TBB_PAR_BACKEND=0 -O3 -DNDEBUG -std=gnu++20 -fsycl -fsycl-unnamed-lambda -fsycl-targets=spir64_gen hellosycl.cpp -o hello_dg12
Environment
Additional context
No response