
espeak-ng-data make build fails #848

Closed
iamashwin99 opened this issue Jul 31, 2023 · 2 comments
Labels
bug Something isn't working


@iamashwin99

LocalAI version:
Docker v1.23.0-cublas-cuda11
Environment, CPU architecture, OS, and Version:
Linux hostname 5.10.0-22-amd64 #1 SMP Debian 5.10.178-3 (2023-04-22) x86_64 GNU/Linux
NVIDIA-SMI 470.182.03 Driver Version: 470.182.03 CUDA Version: 11.4
Describe the bug
Unable to build the Docker image (older versions down to 1.18 were also tried).
Here are the last few lines of the build output:

cd build && cp -rf CMakeFiles/llama.dir/llama.cpp.o ../llama.cpp/llama.o
cd build && cp -rf examples/CMakeFiles/common.dir/common.cpp.o ../llama.cpp/common.o
cd build && cp -rf examples/CMakeFiles/common.dir/grammar-parser.cpp.o ../llama.cpp/grammar-parser.o
g++ -I./llama.cpp -I. -I./llama.cpp/examples -I./examples -O3 -DNDEBUG -std=c++11 -fPIC -Wall -Wextra -Wpedantic -Wcast-qual -Wno-unused-function -pthread -I./llama.cpp -I./llama.cpp/examples binding.cpp -o binding.o -c 
binding.cpp: In function 'int llama_predict(void*, void*, char*, bool)':
binding.cpp:533:42: warning: cast from type 'const char*' to type 'char*' casts away qualifiers [-Wcast-qual]
  533 |             if (!tokenCallback(state_pr, (char*)token_str)) {
      |                                          ^~~~~~~~~~~~~~~~
binding.cpp:591:1: warning: label 'end' defined but not used [-Wunused-label]
  591 | end:
      | ^~~
binding.cpp: In function 'void llama_binding_free_model(void*)':
binding.cpp:613:5: warning: possible problem detected in invocation of 'operator delete' [-Wdelete-incomplete]
  613 |     delete ctx->model;
      |     ^~~~~~~~~~~~~~~~~
binding.cpp:613:17: warning: invalid use of incomplete type 'struct llama_model'
  613 |     delete ctx->model;
      |            ~~~~~^~~~~
In file included from ./llama.cpp/examples/common.h:5,
                 from binding.cpp:1:
./llama.cpp/llama.h:66:12: note: forward declaration of 'struct llama_model'
   66 |     struct llama_model;
      |            ^~~~~~~~~~~
binding.cpp:613:5: note: neither the destructor nor the class-specific 'operator delete' will be called, even if they are declared when the class is defined
  613 |     delete ctx->model;
      |     ^~~~~~~~~~~~~~~~~
cd build && cp -rf CMakeFiles/ggml.dir/k_quants.c.o ../llama.cpp/k_quants.o
ar src libbinding.a llama.cpp/ggml.o llama.cpp/k_quants.o  llama.cpp/common.o llama.cpp/grammar-parser.o llama.cpp/llama.o binding.o
make[1]: Leaving directory '/build/go-llama'
CGO_LDFLAGS="" C_INCLUDE_PATH=/build/go-llama LIBRARY_PATH=/build/go-llama \
go build -ldflags "-X "github.com/go-skynet/LocalAI/internal.Version=v1.23.0-15-gd603a9c" -X "github.com/go-skynet/LocalAI/internal.Commit=d603a9cbb5910eb69cd0bb7458ab40dcbdcf88cd"" -tags "stablediffusion tts" -o backend-assets/grpc/llama ./cmd/grpc/llama/
# github.com/go-skynet/go-llama.cpp
binding.cpp: In function 'void llama_binding_free_model(void*)':
binding.cpp:613:5: warning: possible problem detected in invocation of 'operator delete' [-Wdelete-incomplete]
  613 |     delete ctx->model;
      |     ^~~~~~~~~~~~~~~~~
binding.cpp:613:17: warning: invalid use of incomplete type 'struct llama_model'
  613 |     delete ctx->model;
      |            ~~~~~^~~~~
In file included from go-llama/llama.cpp/examples/common.h:5,
                 from binding.cpp:1:
go-llama/llama.cpp/llama.h:66:12: note: forward declaration of 'struct llama_model'
   66 |     struct llama_model;
      |            ^~~~~~~~~~~
binding.cpp:613:5: note: neither the destructor nor the class-specific 'operator delete' will be called, even if they are declared when the class is defined
  613 |     delete ctx->model;
      |     ^~~~~~~~~~~~~~~~~
# github.com/go-skynet/LocalAI/pkg/grpc/llm/llama
pkg/grpc/llm/llama/llama.go:32:9: undefined: llama.WithRopeFreqBase
pkg/grpc/llm/llama/llama.go:33:9: undefined: llama.WithRopeFreqScale
pkg/grpc/llm/llama/llama.go:82:24: cannot use opts.Temperature (variable of type float32) as float64 value in argument to llama.SetTemperature
pkg/grpc/llm/llama/llama.go:83:17: cannot use opts.TopP (variable of type float32) as float64 value in argument to llama.SetTopP
pkg/grpc/llm/llama/llama.go:88:25: cannot use ropeFreqBase (variable of type float32) as float64 value in argument to llama.SetRopeFreqBase
pkg/grpc/llm/llama/llama.go:89:26: cannot use ropeFreqScale (variable of type float32) as float64 value in argument to llama.SetRopeFreqScale
pkg/grpc/llm/llama/llama.go:90:32: cannot use opts.NegativePromptScale (variable of type float32) as float64 value in argument to llama.SetNegativePromptScale
pkg/grpc/llm/llama/llama.go:112:64: cannot use opts.MirostatETA (variable of type float32) as float64 value in argument to llama.SetMirostatETA
pkg/grpc/llm/llama/llama.go:116:64: cannot use opts.MirostatTAU (variable of type float32) as float64 value in argument to llama.SetMirostatTAU
pkg/grpc/llm/llama/llama.go:126:60: cannot use opts.PresencePenalty (variable of type float32) as float64 value in argument to llama.SetPenalty
pkg/grpc/llm/llama/llama.go:126:60: too many errors
make: *** [Makefile:350: backend-assets/grpc/llama] Error 1
ERROR: Service 'api' failed to build: The command '/bin/sh -c ESPEAK_DATA=/build/lib/Linux-$(uname -m)/piper_phonemize/lib/espeak-ng-data make build' returned a non-zero code: 2
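The wall of "cannot use opts.X (variable of type float32) as float64 value" errors comes from Go's typing rules: unlike C, Go performs no implicit conversion between float32 and float64, so passing the gRPC options' float32 fields to setters declared with float64 parameters (which the stale go-llama.cpp checkout apparently still had) is a hard compile error. A minimal sketch of the mismatch, where SetTemperature is a hypothetical stand-in for the setters named in the log:

```go
package main

import "fmt"

// SetTemperature stands in for a go-llama.cpp option setter from the error
// log; the float64 parameter is an assumption based on the
// "cannot use ... float32 ... as float64" messages.
func SetTemperature(t float64) string {
	return fmt.Sprintf("temperature=%.2f", t)
}

func main() {
	var temp float32 = 0.7 // the gRPC options struct carries float32 fields

	// SetTemperature(temp) // would not compile: Go never widens float32 implicitly

	// An explicit conversion satisfies the compiler:
	fmt.Println(SetTemperature(float64(temp)))
}
```

In this issue the real fix was not to add conversions but to rebuild against a matching checkout, since the errors only appear when the binding and the LocalAI code come from mismatched versions.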

To Reproduce
Change the image in docker-compose.yml to quay.io/go-skynet/local-ai:v1.23.0-cublas-cuda11, then run docker-compose up -d.

@iamashwin99 added the bug label Jul 31, 2023
@mudler (Owner) commented Jul 31, 2023

Run make clean after git pull. You are probably building from a stale code checkout.

@iamashwin99 (Author)

That was the issue; it works now, thanks!
