
install problem #135

Open
Yusufkulcu opened this issue Sep 29, 2024 · 7 comments

@Yusufkulcu

I bought a new virtual server and I'm trying to install humanify on it, but I get the errors shown in the screenshots below. What could be the problem?

I added the error log below

command used: "npm install -g humanifyjs"

(screenshots of the errors attached)

log.txt

@jehna
Owner

jehna commented Sep 29, 2024

What's your node --version

@Yusufkulcu
Author

What's your node --version

(screenshot attached)

@Yusufkulcu
Author

What's your node --version

I have v20.10.0 installed on my computer. I installed the same on the virtual server and tried again. First, it gave the error "Python is not installed" and then it gave the errors I added below.

Operating system: Windows Server 2019

Node version: v20.10.0
Python version: 3.12.6

(screenshot attached)

@Yusufkulcu
Author

Do you have any solution suggestions?

@0xdevalias

0xdevalias commented Sep 30, 2024

Potentially duplicate of/similar root cause as the following issues:

With some notes from there:

If anyone still cannot make it work in Windows then my suggestion is to use WSL, it works with a couple of tweaks (I didn't test with GPU yet)

Originally posted by @neoOpus in #10 (comment)

There is a note about needing Node.js 20 in the readme, but it seems to be easy to miss. I wonder which place would be better. Maybe a preinstall hook that ensures the version?

So, you advise using this in WSL instead of directly on Windows Node.js?

Originally posted by @neoOpus in #71 (comment)

Unfortunately I have no idea how Windows works on development, the last time I used Windows as my main dev machine was over 10 years ago 😅

I'd guess both WSL and non-WSL should work fine, as long as you have a recent enough version. You can use e.g. nvm to switch between versions on WSL

Originally posted by @jehna in #71 (comment)


log.txt

Skimming through the initial attached log, there seem to be a number of errors/warnings.

This part might not matter (not 100%, just guessing):

npm warn cleanup   [
npm warn cleanup     'C:\\Users\\Administrator\\AppData\\Roaming\\npm\\node_modules\\humanifyjs\\node_modules',
npm warn cleanup     [Error: EPERM: operation not permitted, rmdir 'C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\isolated-vm\src'] {
npm warn cleanup       errno: -4048,
npm warn cleanup       code: 'EPERM',
npm warn cleanup       syscall: 'rmdir',
npm warn cleanup       path: 'C:\\Users\\Administrator\\AppData\\Roaming\\npm\\node_modules\\humanifyjs\\node_modules\\isolated-vm\\src'
npm warn cleanup     }
npm warn cleanup   ],
npm warn cleanup   [
npm warn cleanup     'C:\\Users\\Administrator\\AppData\\Roaming\\npm\\node_modules\\humanifyjs\\node_modules\\node-llama-cpp',
npm warn cleanup     [Error: EPERM: operation not permitted, rmdir 'C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\node-llama-cpp\llama\llama.cpp'] {
npm warn cleanup       errno: -4048,
npm warn cleanup       code: 'EPERM',
npm warn cleanup       syscall: 'rmdir',
npm warn cleanup       path: 'C:\\Users\\Administrator\\AppData\\Roaming\\npm\\node_modules\\humanifyjs\\node_modules\\node-llama-cpp\\llama\\llama.cpp'
npm warn cleanup     }
npm warn cleanup   ],
npm warn cleanup   [
npm warn cleanup     'C:\\Users\\Administrator\\AppData\\Roaming\\npm\\node_modules\\humanifyjs',
npm warn cleanup     [Error: EPERM: operation not permitted, rmdir 'C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs'] {
npm warn cleanup       errno: -4048,
npm warn cleanup       code: 'EPERM',
npm warn cleanup       syscall: 'rmdir',
npm warn cleanup       path: 'C:\\Users\\Administrator\\AppData\\Roaming\\npm\\node_modules\\humanifyjs'
npm warn cleanup     }
npm warn cleanup   ]
npm warn cleanup ]

This is probably a breaking error:

npm error code 1
npm error path C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\node-llama-cpp
npm error command failed
npm error command C:\Windows\system32\cmd.exe /d /s /c node ./dist/cli/cli.js postinstall

This is probably only needed because no prebuilt binary was found:

npm error ^[[?25h+ Downloading cmake
npm error × Failed to download cmake

Failing to load a prebuilt binary is probably down to a combination of Node version and system architecture. The 'build from source' fallback might then be failing because you're running it in cmd.exe rather than a shell like bash (assuming it's written to be built on a *nix-type system), or because you don't have the appropriate dev tools installed:


npm error Failed to load a prebuilt binary for platform "win" "x64", falling back to building from source. Error: Error: The specified module could not be found.
npm error \\?\C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\@node-llama-cpp\win-x64\bins\win-x64\llama-addon.node
npm error     at Module._extensions..node (node:internal/modules/cjs/loader:1586:18)
npm error     at Module.load (node:internal/modules/cjs/loader:1288:32)
npm error     at Module._load (node:internal/modules/cjs/loader:1104:12)
npm error     at Module.require (node:internal/modules/cjs/loader:1311:19)
npm error     at require (node:internal/modules/helpers:179:18)
npm error     at loadBindingModule (file:///C:/Users/Administrator/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/node-llama-cpp/dist/bindings/getLlama.js:390:25)
npm error     at loadExistingLlamaBinary (file:///C:/Users/Administrator/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/node-llama-cpp/dist/bindings/getLlama.js:276:37)
npm error     at async getLlamaForOptions (file:///C:/Users/Administrator/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/node-llama-cpp/dist/bindings/getLlama.js:133:27)
npm error     at async Object.handler (file:///C:/Users/Administrator/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/node-llama-cpp/dist/cli/commands/OnPostInstallCommand.js:12:13) {
npm error   code: 'ERR_DLOPEN_FAILED'
npm error }
npm error ^[[?25l^[[?25h'xpm' is not recognized as an internal or external command,
npm error operable program or batch file.
npm error Failed to build llama.cpp with no GPU support. Error: SpawnError: Command npm exec --yes -- xpm@^0.16.3 install @xpack-dev-tools/cmake@latest --no-save exited with code 1
npm error     at createError (file:///C:/Users/Administrator/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/node-llama-cpp/dist/utils/spawnCommand.js:34:20)
npm error     at ChildProcess.<anonymous> (file:///C:/Users/Administrator/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/node-llama-cpp/dist/utils/spawnCommand.js:47:24)
npm error     at ChildProcess.emit (node:events:519:28)
npm error     at cp.emit (C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\cross-spawn\lib\enoent.js:34:29)
npm error     at ChildProcess._handle.onexit (node:internal/child_process:294:12)
npm error SpawnError: Command npm exec --yes -- xpm@^0.16.3 install @xpack-dev-tools/cmake@latest --no-save exited with code 1
npm error     at createError (file:///C:/Users/Administrator/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/node-llama-cpp/dist/utils/spawnCommand.js:34:20)
npm error     at ChildProcess.<anonymous> (file:///C:/Users/Administrator/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/node-llama-cpp/dist/utils/spawnCommand.js:47:24)
npm error     at ChildProcess.emit (node:events:519:28)
npm error     at cp.emit (C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\cross-spawn\lib\enoent.js:34:29)
npm error     at ChildProcess._handle.onexit (node:internal/child_process:294:12)

'xpm' is not recognized as an internal or external command, operable program or batch file.


Failed to load a prebuilt binary for platform "win" "x64", falling back to building from source. Error: Error: The specified module could not be found. \\?\C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\@node-llama-cpp\win-x64\bins\win-x64\llama-addon.node

  • https://github.com/withcatai/node-llama-cpp#installation
    • This package comes with pre-built binaries for macOS, Linux and Windows.

      If binaries are not available for your platform, it'll fallback to download a release of llama.cpp and build it from source with cmake. To disable this behavior, set the environment variable NODE_LLAMA_CPP_SKIP_DOWNLOAD to true.

    • https://node-llama-cpp.withcat.ai/guide/building-from-source
      • INFO

        If cmake is not installed on your machine, node-llama-cpp will automatically download cmake to an internal directory and try to use it to build llama.cpp from source.

        If the build fails, make sure you have the required dependencies of cmake installed on your machine.

We can see the prebuilt dependencies in the node-llama-cpp package.json here:

  "optionalDependencies": {
    "@node-llama-cpp/linux-arm64": "0.1.0",
    "@node-llama-cpp/linux-armv7l": "0.1.0",
    "@node-llama-cpp/linux-x64": "0.1.0",
    "@node-llama-cpp/linux-x64-cuda": "0.1.0",
    "@node-llama-cpp/linux-x64-vulkan": "0.1.0",
    "@node-llama-cpp/mac-arm64-metal": "0.1.0",
    "@node-llama-cpp/mac-x64": "0.1.0",
    "@node-llama-cpp/win-arm64": "0.1.0",
    "@node-llama-cpp/win-x64": "0.1.0",
    "@node-llama-cpp/win-x64-cuda": "0.1.0",
    "@node-llama-cpp/win-x64-vulkan": "0.1.0"
  }

The source for these are here:

Since there is a @node-llama-cpp/win-x64 package, the Failed to load a prebuilt binary for platform "win" "x64" part of the error is definitely... interesting.

@Yusufkulcu Does the machine you're installing this on have access to the internet? I'm guessing it tries to dynamically decide which optional binary package it will need based on the system and install that.

We can see that when building the package, it runs npm run addPostinstallScript:

Which we can see is defined in package.json, and basically tells the built package to trigger node ./dist/cli/cli.js postinstall:

"addPostinstallScript": "npm pkg set scripts.postinstall=\"node ./dist/cli/cli.js postinstall\"",

In cli/cli.js we can see that it registers the OnPostInstallCommand:

Which is defined here, and calls getLlamaForOptions:

Which is defined here, and seems to be the part of the package that gets the system's platform/architecture and tries to determine whether it can use prebuilt binaries or not/etc:

Within that, we can see there is a call to loadExistingLlamaBinary:

That calls getPrebuiltBinaryPath, which then calls getPrebuiltBinariesPackageDirectoryForBuildOptions, which is what handles attempting to import the various prebuilt binary packages:

    } else if (buildOptions.platform === "win") {
        if (buildOptions.arch === "x64") {
            if (buildOptions.gpu === "cuda")
                // @ts-ignore
                return getBinariesPathFromModules(() => import("@node-llama-cpp/win-x64-cuda"));
            else if (buildOptions.gpu === "vulkan")
                // @ts-ignore
                return getBinariesPathFromModules(() => import("@node-llama-cpp/win-x64-vulkan"));
            else if (buildOptions.gpu === false)
                // @ts-ignore
                return getBinariesPathFromModules(() => import("@node-llama-cpp/win-x64"));
        } else if (buildOptions.arch === "arm64")
            // @ts-ignore
            return getBinariesPathFromModules(() => import("@node-llama-cpp/win-arm64"));
    }

This then calls getBinariesPathFromModules which seems to do some stuff to find the bins dir:

Jumping back out to loadExistingLlamaBinary, we can see the 'failed to load a prebuilt binary' message here within a catch block:

The actual triggering error is appended to the end of that message, so the root issue appears to be this part:

Error: Error: The specified module could not be found.
npm error \\?\C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\@node-llama-cpp\win-x64\bins\win-x64\llama-addon.node
npm error     at Module._extensions..node (node:internal/modules/cjs/loader:1586:18)
npm error     at Module.load (node:internal/modules/cjs/loader:1288:32)
npm error     at Module._load (node:internal/modules/cjs/loader:1104:12)
npm error     at Module.require (node:internal/modules/cjs/loader:1311:19)
npm error     at require (node:internal/modules/helpers:179:18)
npm error     at loadBindingModule (file:///C:/Users/Administrator/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/node-llama-cpp/dist/bindings/getLlama.js:390:25)
npm error     at loadExistingLlamaBinary (file:///C:/Users/Administrator/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/node-llama-cpp/dist/bindings/getLlama.js:276:37)
npm error     at async getLlamaForOptions (file:///C:/Users/Administrator/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/node-llama-cpp/dist/bindings/getLlama.js:133:27)
npm error     at async Object.handler (file:///C:/Users/Administrator/AppData/Roaming/npm/node_modules/humanifyjs/node_modules/node-llama-cpp/dist/cli/commands/OnPostInstallCommand.js:12:13) {
npm error   code: 'ERR_DLOPEN_FAILED'
npm error }

@Yusufkulcu So I guess the first thing I would be doing is checking whether that file actually exists, and if there is anything that might be blocking it from being able to be loaded (permissions, antivirus, etc):

  • C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\@node-llama-cpp\win-x64\bins\win-x64\llama-addon.node

If that doesn't help, you may be able to get some more specific ideas/support on the GitHub Discussions page for node-llama-cpp:

I took the liberty of opening a discussion there based on this issue, to potentially expedite getting an official answer:


@Yusufkulcu @jehna It seems that humanify is currently using node-llama-cpp 3.0.0-beta.40:

"node-llama-cpp": "^3.0.0-beta.40",

Whereas it seems to now be up to 3.0.3, so it's possible there have been some relevant bug fixes/improvements released since then:


I have v20.10.0 installed on my computer. I installed the same on the virtual server and tried again. First, it gave the error "Python is not installed" and then it gave the errors I added below.

Operating system: Windows Server 2019

Node version: v20.10.0
Python version: 3.12.6

@Yusufkulcu The error in your screenshot here is a different error from the one in your initial post. The error you're getting on the virtual server is related to there not being a prebuilt binary of isolated-vm for that server, which is basically the root cause of this issue:

But solving that won't solve the issues you're having on your main system install.

@0xdevalias

0xdevalias commented Sep 30, 2024


@Yusufkulcu Some further ideas/Google results based on Error: EPERM: operation not permitted, rmdir 'C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\node-llama-cpp\llama\llama.cpp':

@0xdevalias

Some notes from upstream:


it would be helpful if you can check if C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\@node-llama-cpp\win-x64\bins\win-x64\llama-addon.node exists.

Originally posted by @giladgd in withcatai/node-llama-cpp#353 (comment)


Actually, this may indeed be the permission issue I mentioned.
The attached error shows that node-llama-cpp is installed under the Administrator user, which should never be done, and is very prone to permission issues.

Originally posted by @giladgd in withcatai/node-llama-cpp#353 (reply in thread)


Full comment:

I run some tests on the latest version (3.0.3), and it seems to work as expected on Windows.
I've tested it on both a Windows 10 machine and on a Windows 11 machine.

From the logs you attached, I think there's something wrong in the installation on the user machine: whether it's npm that hasn't installed the prebuilt binaries properly (maybe they're using some npm mirror that doesn't work as expected), maybe it's related to permissions on the machine itself (like running npm install in the folder from one user and trying to use it from another user that doesn't have proper access to the node_modules directory), or maybe it has something to do with some program on the machine that blocks access to the .node file.

From the linked issue I can see in the logs that it attempted to fallback to building from source, and since CMake is not installed on the user's machine, it tried to call xpm to install CMake locally inside node-llama-cpp, but it failed with this error:

npm error ^[[?25l^[[?25h'xpm' is not recognized as an internal or external command,
npm error operable program or batch file.
npm error Failed to build llama.cpp with no GPU support. Error: SpawnError: Command npm exec --yes -- xpm@^0.16.3 install @xpack-dev-tools/cmake@latest --no-save exited with code 1

The ^[[?25l^[[?25h'xpm' part seems to indicate one of the following:

  • There's an issue with the installation of nodejs/npm on the user machine.

    npm exec --yes is equivalent to npx -y, but npm exec --yes works better with other runtimes such as Bun, which is why I used it.

  • There's an issue with the escaping done by cross-spawn to run the npm exec --yes -- xpm@^0.16.3 command. I used cross-spawn because it solved many issues with process spawning on Windows, so I think this is less probable.

What I would suggest to try to fix this issue would be:

  • Uninstall all global npm modules, including npm, and reinstall nodejs, since it seems like there's an issue with its installation.
  • Try to use Bun, since it seems they've fixed many Windows compatibility issues.
  • Install CMake manually, so node-llama-cpp can use it.

Before doing all of the above, it would be helpful if you can check if C:\Users\Administrator\AppData\Roaming\npm\node_modules\humanifyjs\node_modules\@node-llama-cpp\win-x64\bins\win-x64\llama-addon.node exists.

Originally posted by @giladgd in withcatai/node-llama-cpp#353 (comment)
