
Compile bug: redefinition of 'res_error' as different kind of symbol #17678

@hksdpc255

Description

Git commit

current master

Operating systems

Linux

GGML backends

CUDA

Problem description & steps to reproduce

Compiling llama.cpp with the clang++ toolchain and sysroot provided by Anaconda/Miniforge fails. The sysroot's resolv.h declares res_error as an enumerator of the res_sendhookact enum in the global namespace, which collides with the static helper function res_error defined in tools/server/server-models.cpp; every subsequent call site then tries to "call" the enumerator instead of the function (see the log below).
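A minimal sketch of the collision, assuming a sysroot whose resolv.h still exposes the res_sendhookact hook enum; the standalone file below is not part of llama.cpp and only reproduces the two diagnostics seen in the log:

// On the affected sysroot, <resolv.h> brings in:
//   typedef enum { res_goahead, res_nextns, res_modified, res_done, res_error } res_sendhookact;
// placing the enumerator res_error in the global namespace.
#include <resolv.h>

// error: redefinition of 'res_error' as different kind of symbol
static void res_error(int code) {
    (void) code;
}

int main() {
    // error: called object type 'res_sendhookact' is not a function or function pointer
    res_error(0);
    return 0;
}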

First Bad Commit

ec18edf

Compile command

cmake --build llama.cpp/build --config Release -j 16 --target llama-server

Relevant log output

[1/2] Building CXX object tools/server/CMakeFiles/llama-server.dir/server-models.cpp.o
FAILED: tools/server/CMakeFiles/llama-server.dir/server-models.cpp.o 
/work/miniforge3/envs/cuda13/bin/clang++ -DGGML_USE_CPU -DGGML_USE_CUDA -DGGML_USE_RPC -DLLAMA_USE_HTTPLIB -I/home/builduser/llama.cpp/tools/server -I/home/builduser/llama.cpp/build/tools/server -I/home/builduser/llama.cpp/tools/server/../mtmd -I/home/builduser/llama.cpp -I/home/builduser/llama.cpp/common/. -I/home/builduser/llama.cpp/common/../vendor -I/home/builduser/llama.cpp/src/../include -I/home/builduser/llama.cpp/ggml/src/../include -I/home/builduser/llama.cpp/tools/mtmd/. -fPIC -static-libgcc -static-libstdc++ -static-libgfortran -g0 -O3 -march=native -mtune=native -feliminate-unused-debug-types -pipe -Wall -Wno-unused-command-line-argument -fasynchronous-unwind-tables -Wl,-z,now -Wl,-z,relro -fno-semantic-interposition -fno-fat-lto-objects -fno-trapping-math -Wl,-sort-common -Wl,--enable-new-dtags -ffunction-sections -fvisibility-inlines-hidden -Wl,--enable-new-dtags -flto=full -fuse-ld=lld -O3 -DNDEBUG -flto=thin -fPIE -Wmissing-declarations -Wmissing-noreturn -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -Wunreachable-code-break -Wunreachable-code-return -Wmissing-prototypes -Wextra-semi -pthread -MD -MT tools/server/CMakeFiles/llama-server.dir/server-models.cpp.o -MF tools/server/CMakeFiles/llama-server.dir/server-models.cpp.o.d -o tools/server/CMakeFiles/llama-server.dir/server-models.cpp.o -c /home/builduser/llama.cpp/tools/server/server-models.cpp
/home/builduser/llama.cpp/tools/server/server-models.cpp:590:13: error: redefinition of 'res_error' as different kind of symbol
  590 | static void res_error(std::unique_ptr<server_http_res> & res, const json & error_data) {
      |             ^
/work/miniforge3/envs/cuda13/x86_64-conda-linux-gnu/sysroot/usr/include/resolv.h:71:65: note: previous definition is here
   71 | typedef enum { res_goahead, res_nextns, res_modified, res_done, res_error }
      |                                                                 ^
/home/builduser/llama.cpp/tools/server/server-models.cpp:597:18: error: called object type 'res_sendhookact' is not a function or function pointer
  597 |         res_error(res, format_error_response("model name is missing from the request", ERROR_TYPE_INVALID_REQUEST));
      |         ~~~~~~~~~^
/home/builduser/llama.cpp/tools/server/server-models.cpp:602:18: error: called object type 'res_sendhookact' is not a function or function pointer
  602 |         res_error(res, format_error_response("model not found", ERROR_TYPE_INVALID_REQUEST));
      |         ~~~~~~~~~^
/home/builduser/llama.cpp/tools/server/server-models.cpp:609:22: error: called object type 'res_sendhookact' is not a function or function pointer
  609 |             res_error(res, format_error_response("model is not loaded", ERROR_TYPE_INVALID_REQUEST));
      |             ~~~~~~~~~^
/home/builduser/llama.cpp/tools/server/server-models.cpp:709:22: error: called object type 'res_sendhookact' is not a function or function pointer
  709 |             res_error(res, format_error_response("model is not found", ERROR_TYPE_NOT_FOUND));
      |             ~~~~~~~~~^
/home/builduser/llama.cpp/tools/server/server-models.cpp:713:22: error: called object type 'res_sendhookact' is not a function or function pointer
  713 |             res_error(res, format_error_response("model is already loaded", ERROR_TYPE_INVALID_REQUEST));
      |             ~~~~~~~~~^
/home/builduser/llama.cpp/tools/server/server-models.cpp:771:22: error: called object type 'res_sendhookact' is not a function or function pointer
  771 |             res_error(res, format_error_response("model is not found", ERROR_TYPE_INVALID_REQUEST));
      |             ~~~~~~~~~^
/home/builduser/llama.cpp/tools/server/server-models.cpp:775:22: error: called object type 'res_sendhookact' is not a function or function pointer
  775 |             res_error(res, format_error_response("model is not loaded", ERROR_TYPE_INVALID_REQUEST));
      |             ~~~~~~~~~^
8 errors generated.
ninja: build stopped: subcommand failed.
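A possible workaround, sketched below under the assumption that renaming or scoping the helper is acceptable (this is not necessarily the fix the maintainers will choose): moving the helper into a namespace takes it out of the global namespace, so it no longer clashes with the glibc enumerator, while unqualified calls inside that namespace still resolve to the function. The namespace name and the simplified signature are placeholders, not the real llama.cpp code.

#include <resolv.h>
#include <cstdio>

namespace server_models {

// No longer a redefinition: the glibc enumerator res_error exists only in the
// global namespace, while this function lives in server_models.
static void res_error(int code) {
    std::printf("error %d\n", code);
}

static void handler() {
    // Unqualified lookup finds server_models::res_error first and stops,
    // so the global enumerator is never considered here.
    res_error(42);
}

} // namespace server_models

int main() {
    server_models::handler();
    return 0;
}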

Labels

bug (Something isn't working), regression (A regression introduced in a new build; something that was previously working correctly)
