Support attention_bias on LLaMA architecture #6658
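For context on what the PR title refers to: some LLaMA-style checkpoints (e.g. Qwen-derived models) are trained with bias terms on the Q/K/V linear projections, so loading them requires the attention path to optionally add those biases. The sketch below is a minimal, hypothetical illustration of that idea in plain NumPy, not the actual llama.cpp implementation; all names (`linear`, `attention`, `bq`/`bk`/`bv`) are made up for the example.

```python
import numpy as np

def linear(x, w, b=None):
    # x: (seq, d_in), w: (d_in, d_out), b: optional (d_out,) bias vector.
    y = x @ w
    if b is not None:
        # This optional addition is what "attention_bias" support enables.
        y = y + b
    return y

def attention(x, wq, wk, wv, bq=None, bk=None, bv=None):
    # Single-head scaled dot-product attention; bias tensors are optional,
    # so checkpoints without them behave exactly as before.
    q = linear(x, wq, bq)
    k = linear(x, wk, bk)
    v = linear(x, wv, bv)
    scores = q @ k.T / np.sqrt(q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ v
```

When the bias arguments are `None` the computation reduces to the standard bias-free LLaMA attention, which is why such a change can be backward compatible with existing models.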
build.yml
on: pull_request
Matrix: windows-latest-cmake-cublas
Matrix: windows-latest-cmake
ubuntu-focal-make — 1m 17s
ubuntu-latest-cmake — 1m 23s
macOS-latest-make — 2m 18s
macOS-latest-cmake — 3m 48s
macOS-latest-cmake-ios — 1m 27s
macOS-latest-cmake-tvos — 1m 34s
ios-xcode-build — 1m 31s
Matrix: macOS-latest-swift
Matrix: ubuntu-latest-cmake-mpi
Matrix: ubuntu-latest-cmake-sanitizer
release — 0s
Annotations
1 error
windows-latest-cmake (avx512, -DLLAMA_NATIVE=OFF -DLLAMA_BUILD_SERVER=ON -DLLAMA_AVX512=ON -DBUIL...
Process completed with exit code 1.