WSL/Java 17 is supported? #26
Hi @javierfd, I don't have much experience with WSL, but Java 17 is definitely supported. Can you please attach the detailed error message? Also, which library is loaded? There should be some log messages about this at startup.
I can maybe have a look over the next few days.
Same as #25, do you have info on your CPU you can share?
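As a quick way to answer that under WSL, here is a small, self-contained Java sketch (not part of the library) that prints the CPU flag line the virtualized CPU reports via /proc/cpuinfo and whether any AVX-512 extension shows up in it:

import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

// Standalone diagnostic: print the CPU flags reported under WSL and check for AVX-512.
public class CpuFlags {
    public static void main(String[] args) throws Exception {
        List<String> lines = Files.readAllLines(Path.of("/proc/cpuinfo"));
        String flags = lines.stream()
                .filter(l -> l.startsWith("flags"))
                .findFirst()
                .orElse("flags : (not found)");
        System.out.println(flags);
        System.out.println("AVX-512 reported: " + flags.contains("avx512"));
    }
}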
Hi, this is the complete log output (machine specs are included in it):

A fatal error has been detected by the Java Runtime Environment:
SIGILL (0x4) at pc=0x00007f7a5efb6142, pid=111451, tid=111487
JRE version: OpenJDK Runtime Environment (17.0.8.1+1) (build 17.0.8.1+1-Ubuntu-0ubuntu122.04)
No core dump will be written. Core dumps have been disabled. To enable core dumping, try "ulimit -c unlimited" before starting Java again
If you would like to submit a bug report, please visit:

--------------- S U M M A R Y ------------

Command Line: -XX:TieredStopAtLevel=1 -Djava.library.path=/home/pablo/projects/ai/shared/llama.cpp -Dde.kherud.llama.lib.path=/home/pablo/projects/ai/shared/java-llama.cpp/build/src/main/cpp/llama.cpp com.pablo.ai.llm.poc.aipoc.AipocApplication
Host: Intel(R) Core(TM) i5-10310U CPU @ 1.70GHz, 8 cores, 15G, Ubuntu 22.04.2 LTS

--------------- T H R E A D ---------------

Current thread (0x00007f7ad0a66c70): JavaThread "http-nio-8080-exec-2" daemon [_thread_in_native, id=111487, stack(0x00007f7aacdee000,0x00007f7aaceee000)]
Stack: [0x00007f7aacdee000,0x00007f7aaceee000], sp=0x00007f7aaceea060, free space=1008k
Java frames: (J=compiled Java code, j=interpreted, Vv=VM code)
siginfo: si_signo: 4 (SIGILL), si_code: 2 (ILL_ILLOPN), si_addr: 0x00007f7a5efb6142
Register to memory mapping:
RAX=0x00007f7aaceea3f0 is pointing into the stack for thread: 0x00007f7ad0a66c70
Registers:
Top of Stack: (sp=0x00007f7aaceea060)
Instructions: (pc=0x00007f7a5efb6142)
Stack slot to memory mapping:

--------------- P R O C E S S ---------------

Threads class SMR info:
Java Threads: ( => current thread )
Other Threads:
Threads with active compile tasks:
VM state: not at safepoint (normal execution)
VM Mutex/Monitor currently owned by a thread: None
Heap address: 0x0000000709400000, size: 3948 MB, Compressed Oops mode: Zero based, Oop shift amount: 3
CDS archive(s) mapped at: [0x00007f7a5f000000-0x00007f7a5fbeb000-0x00007f7a5fbeb000), size 12496896, SharedBaseAddress: 0x00007f7a5f000000, ArchiveRelocationMode: 1.
GC Precious Log:
Heap:
Heap Regions: E=young(eden), S=young(survivor), O=old, HS=humongous(starts), HC=humongous(continues), CS=collection set, F=free, OA=open archive, CA=closed archive, TAMS=top-at-mark-start (previous, next)
Card table byte_map: [0x00007f7ad4726000,0x00007f7ad4edc000] _byte_map_base: 0x00007f7ad0edc000
Marking Bits (Prev, Next): (CMBitMap*) 0x00007f7ad007f940, (CMBitMap*) 0x00007f7ad007f900
Polling page: 0x00007f7ad7964000
Metaspace:
Usage:
Virtual space:
Chunk freelists:
MaxMetaspaceSize: unlimited
Internal statistics:
num_allocs_failed_limit: 3.
CodeCache: size=49152Kb used=7338Kb max_used=7698Kb free=41813Kb
Compilation events (20 events):
GC Heap History (10 events):
Dll operation events (11 events):
Deoptimization events (10 events):
Classes unloaded (0 events):
Classes redefined (0 events):
Internal exceptions (20 events):
VM Operations (20 events):
Events (20 events):
Dynamic libraries:
VM Arguments:
[Global flags]
Logging:
Environment Variables:
Active Locale:
Signal Handlers:

--------------- S Y S T E M ---------------

OS:
/proc/meminfo:
/sys/kernel/mm/transparent_hugepage/enabled: [always] madvise never
Process Memory:
/proc/sys/kernel/threads-max (system-wide limit on the number of threads): 126268
container (cgroup) information:
Hyper-V virtualization detected
CPU: total 8 (initial active 8) (4 cores per cpu, 2 threads per core) family 6 model 142 stepping 12 microcode 0xffffffff, cx8, cmov, fxsr, ht, mmx, 3dnowpref, sse, sse2, sse3, ssse3, sse4.1, sse4.2, popcnt, lzcnt, tsc, avx, avx2, aes, erms, clmul, bmi1, bmi2, adx, fma, vzeroupper, clflush, clflushopt, hv
Online cpus: 0-7
Memory: 4k page, physical 16168892k(13169356k free), swap 4194304k(4194304k free)

vm_info: OpenJDK 64-Bit Server VM (17.0.8.1+1-Ubuntu-0ubuntu122.04) for linux-amd64 JRE (17.0.8.1+1-Ubuntu-0ubuntu122.04), built on Aug 24 2023 23:23:28 by "buildd" with gcc 11.4.0

END.
Same instruction that caused this issue on my end. Dumping offset 23142 gives: 62 f1 7f 08 7f 83 88 02 00 00  vmovdqu8 %xmm0,0x288(%rbx). This vmovdqu8 uses an EVEX encoding, which requires AVX-512 support (AVX512BW, plus AVX512VL for this 128-bit form).
Looking at the CPU feature flags reported in your log (sse through sse4.2, avx, avx2, aes, fma, etc.), we can see that neither of the extensions required by this instruction (AVX512VL or AVX512BW) is supported by your CPU. Another clue is the problematic frame from your log, C [libjllama.so+0x23142] gpt_params::gpt_params()+0x3e2: the illegal instruction sits inside the natively built libjllama.so. So, we need to change our build script so this instruction encoding is not used, which means not building with LLAMA_NATIVE.
Quick update! I modified the file build-args.cmake and set the LLAMA_NATIVE flag to OFF. Then I rebuilt and it worked fine! I also found I was passing both JVM arg parameters (I kept only -Dde.kherud.llama.lib.path).
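On that second point, here is a minimal Java sketch of selecting the locally built native library through the de.kherud.llama.lib.path property seen in the command line above; whether setting it programmatically (instead of via -D on the command line) is picked up depends on when the binding reads the property, so treat this as an assumption rather than the documented approach:

// Hedged sketch: point the binding at a locally built libjllama.so via the system
// property from the crash log's command line. The path is the one reported in this
// thread; setting the property programmatically only works if it happens before the
// binding loads its native library for the first time.
public class NativeLibPath {
    public static void main(String[] args) {
        System.setProperty("de.kherud.llama.lib.path",
                "/home/pablo/projects/ai/shared/java-llama.cpp/build/src/main/cpp/llama.cpp");
        // ...create and use the model after this point.
    }
}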
Yep, that builds it without any uncommon extensions like AVX512. Looking at their CMake configuration (code below, since GitHub doesn't render permalinks outside the linked repo):

# instruction set specific
if (LLAMA_NATIVE)
set(INS_ENB OFF)
else()
set(INS_ENB ON)
endif()
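# INS_ENB is the default for the explicit instruction-set options below: ON for a
# generic build, OFF when LLAMA_NATIVE is on and instruction selection is left to the
# compiler's host/native detection instead.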
option(LLAMA_AVX "llama: enable AVX" ${INS_ENB})
option(LLAMA_AVX2 "llama: enable AVX2" ${INS_ENB})
I have not tested the latest version yet, but looking at the commit history of this repo it looks like @kherud added that flag to the Linux builds. Are you able to verify that they work for you now?
Ohhh, I see! Lemme retry it with a clean env and I'll come back with an update.
I'll close this issue for now. Feel free to re-open if you still have problems (you might also have a look at version 3.0).
Hi,
I'm doing a quick test in this environment (Java 17 / Ubuntu on WSL):
Ran a simple example (see the sketch after this comment):
LlamaModel model = new LlamaModel(...);
model.generate(...);
It fails on loading the Llama model (loadModel(String, ModelParameters)), at:
C [libjllama.so+0x23142] gpt_params::gpt_params()+0x3e2
Maybe this version of Java/Ubuntu on WSL is not supported?
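For reference, a minimal sketch of the reproduction described above; the String-taking constructor and the generate(String) signature are assumptions inferred from the snippets in this thread (and the loadModel(String, ModelParameters) frame), not verified against a particular library version:

import de.kherud.llama.LlamaModel;

// Hypothetical repro: constructor and generate(String) are assumed from the snippets
// in this thread; the model path is a placeholder.
public class WslRepro {
    public static void main(String[] args) throws Exception {
        LlamaModel model = new LlamaModel("/path/to/model.gguf"); // the SIGILL reportedly occurs here, during native model loading
        model.generate("Hello");                                  // not reached when the crash happens
    }
}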