build(deps): bump transformers from 4.45.1 to 4.46.1 (#944)
Bumps [transformers](https://github.com/huggingface/transformers) from 4.45.1 to 4.46.1.

<details>
<summary>Release notes</summary>
<p><em>Sourced from <a href="https://github.com/huggingface/transformers/releases">transformers's releases</a>.</em></p>
<blockquote>
<h2>Patch release v4.46.1</h2>
<p>This is mostly for <code>fx</code> and <code>onnx</code> issues!</p>
<ul>
<li>Fix regression loading dtype <a href="https://redirect.github.com/huggingface/transformers/issues/34409">#34409</a> by <a href="https://github.com/SunMarc"><code>@SunMarc</code></a></li>
<li>LLaVa: latency issues <a href="https://redirect.github.com/huggingface/transformers/issues/34460">#34460</a> by <a href="https://github.com/zucchini-nlp"><code>@zucchini-nlp</code></a></li>
<li>Fix pix2struct <a href="https://redirect.github.com/huggingface/transformers/issues/34374">#34374</a> by <a href="https://github.com/IlyasMoutawwakil"><code>@IlyasMoutawwakil</code></a></li>
<li>Fix onnx non-exposable inplace aten op <a href="https://redirect.github.com/huggingface/transformers/issues/34376">#34376</a> by <a href="https://github.com/IlyasMoutawwakil"><code>@IlyasMoutawwakil</code></a></li>
<li>Fix torch.fx issue related to the new <code>loss_kwargs</code> keyword argument <a href="https://redirect.github.com/huggingface/transformers/issues/34380">#34380</a> by <a href="https://github.com/michaelbenayoun"><code>@michaelbenayoun</code></a></li>
</ul>
<h2>Release v4.46.0</h2>
<h2>New model additions</h2>
<h3>Moshi</h3>
<p>The Moshi model was proposed in Moshi: a speech-text foundation model for real-time dialogue by Alexandre Défossez, Laurent Mazaré, Manu Orsini, Amélie Royer, Patrick Pérez, Hervé Jégou, Edouard Grave and Neil Zeghidour.</p>
<p>Moshi is a speech-text foundation model that casts spoken dialogue as speech-to-speech generation. Starting from a text language model backbone, Moshi generates speech as tokens from the residual quantizer of a neural audio codec, while separately modeling its own speech and that of the user as parallel streams. This allows for the removal of explicit speaker turns and the modeling of arbitrary conversational dynamics. Moshi also predicts time-aligned text tokens as a prefix to audio tokens. This “Inner Monologue” method significantly improves the linguistic quality of generated speech and provides streaming speech recognition and text-to-speech. As a result, Moshi is the first real-time full-duplex spoken large language model, with a theoretical latency of 160 ms (200 ms in practice).</p>
<p><img src="https://github.com/user-attachments/assets/00ed5bcc-47b2-4b73-a8f1-2aa0a2e12b32" alt="image" /></p>
<ul>
<li>Moshi integration by <a href="https://github.com/ylacombe"><code>@ylacombe</code></a> in <a href="https://redirect.github.com/huggingface/transformers/issues/33624">#33624</a></li>
</ul>
<h3>Zamba</h3>
<p>Zamba-7B-v1 is a hybrid between state-space models (specifically Mamba) and transformers, trained using next-token prediction. Zamba uses a shared transformer layer after every 6 Mamba blocks, and uses the Mistral v0.1 tokenizer. We came to this architecture after a series of ablations at small scales. Zamba-7B-v1 was pre-trained on 1T tokens of text and code data.</p>
<!-- raw HTML omitted -->
<ul>
<li>Add Zamba by <a href="https://github.com/pglorio"><code>@pglorio</code></a> in <a href="https://redirect.github.com/huggingface/transformers/issues/30950">#30950</a></li>
</ul>
<h3>GLM</h3>
<p>The GLM model was proposed in ChatGLM: A Family of Large Language Models from GLM-130B to GLM-4 All Tools by the GLM Team, THUDM & ZhipuAI.</p>
<p>The abstract from the paper starts with the following:</p>
<!-- raw HTML omitted -->
</blockquote>
<p>... (truncated)</p>
</details>
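Of the patch fixes above, the dtype regression ([#34409](https://redirect.github.com/huggingface/transformers/issues/34409)) concerns checkpoints loaded with an explicit `torch_dtype`. A minimal sketch of the affected loading path; the `gpt2` checkpoint is illustrative and not taken from the PR:

```python
# Sketch of the loading path covered by the #34409 regression fix:
# from_pretrained with an explicit torch_dtype should yield a model
# whose parameters actually carry that dtype. "gpt2" is illustrative.
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("gpt2", torch_dtype=torch.float16)
assert model.dtype == torch.float16  # holds again as of 4.46.1
```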
<details>
<summary>Commits</summary>
<ul>
<li><a href="https://github.com/huggingface/transformers/commit/bc598c00db37d1fbb1551723873d37e238c3ede7"><code>bc598c0</code></a> v4.46.1</li>
<li><a href="https://github.com/huggingface/transformers/commit/94ed13c1de2b98da59a3721ec41dba42692a7984"><code>94ed13c</code></a> Fix regression loading dtype (<a href="https://redirect.github.com/huggingface/transformers/issues/34409">#34409</a>)</li>
<li><a href="https://github.com/huggingface/transformers/commit/72c716de9274fce07924cb7780dc6392fc276923"><code>72c716d</code></a> LLaVA: latency issues (<a href="https://redirect.github.com/huggingface/transformers/issues/34460">#34460</a>)</li>
<li><a href="https://github.com/huggingface/transformers/commit/97bb9299c4f179384d470b6909d2259368011be1"><code>97bb929</code></a> Fix pix2struct (<a href="https://redirect.github.com/huggingface/transformers/issues/34374">#34374</a>)</li>
<li><a href="https://github.com/huggingface/transformers/commit/565f0e97c2941eb9a20d8f8c33a09d9787e8caee"><code>565f0e9</code></a> Fix onnx non-exposable inplace aten op (<a href="https://redirect.github.com/huggingface/transformers/issues/34376">#34376</a>)</li>
<li><a href="https://github.com/huggingface/transformers/commit/dcfe3c7e618cc628cdd860a4c9b36d8f155266fe"><code>dcfe3c7</code></a> Fix <code>torch.fx</code> issue related to the new <code>loss_kwargs</code> keyword argument (<a href="https://redirect.github.com/huggingface/transformers/issues/34380">#34380</a>)</li>
<li><a href="https://github.com/huggingface/transformers/commit/c2820c94916e34baf4486accae74760972183a2f"><code>c2820c9</code></a> fix list 3.8</li>
<li><a href="https://github.com/huggingface/transformers/commit/b2981611463690f296282ea360ec0d9e63485dff"><code>b298161</code></a> v4.46.0</li>
<li><a href="https://github.com/huggingface/transformers/commit/b0f0c61899019d316db17a493023828aa44db06d"><code>b0f0c61</code></a> Add SynthID (watermarking by Google DeepMind) (<a href="https://redirect.github.com/huggingface/transformers/issues/34350">#34350</a>)</li>
<li><a href="https://github.com/huggingface/transformers/commit/e50bf61decf741c6d59e4ba633b7392712673bda"><code>e50bf61</code></a> Fix red CI: benchmark script (<a href="https://redirect.github.com/huggingface/transformers/issues/34351">#34351</a>)</li>
<li>Additional commits viewable in <a href="https://github.com/huggingface/transformers/compare/v4.45.1...v4.46.1">compare view</a></li>
</ul>
</details>
<br />

[![Dependabot compatibility score](https://dependabot-badges.githubapp.com/badges/compatibility_score?dependency-name=transformers&package-manager=pip&previous-version=4.45.1&new-version=4.46.1)](https://docs.github.com/en/github/managing-security-vulnerabilities/about-dependabot-security-updates#about-compatibility-scores)
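The <code>torch.fx</code> fix (<code>dcfe3c7</code>, #34380) lands in the symbolic tracing entry point that transformers exposes in `transformers.utils.fx`. A minimal sketch of that path, assuming an fx-traceable architecture such as GPT-2; the checkpoint name is illustrative:

```python
# Sketch of the torch.fx tracing entry point touched by #34380:
# tracing a model whose forward now accepts the new loss_kwargs.
# "gpt2" is an illustrative, fx-traceable checkpoint.
from transformers import AutoModelForCausalLM
from transformers.utils.fx import symbolic_trace

model = AutoModelForCausalLM.from_pretrained("gpt2")
traced = symbolic_trace(model, input_names=["input_ids", "attention_mask"])
print(type(traced))  # a torch.fx GraphModule wrapping the original forward
```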
Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting `@dependabot rebase`.

[//]: # (dependabot-automerge-start)
[//]: # (dependabot-automerge-end)

---

<details>
<summary>Dependabot commands and options</summary>
<br />

You can trigger Dependabot actions by commenting on this PR:

- `@dependabot rebase` will rebase this PR
- `@dependabot recreate` will recreate this PR, overwriting any edits that have been made to it
- `@dependabot merge` will merge this PR after your CI passes on it
- `@dependabot squash and merge` will squash and merge this PR after your CI passes on it
- `@dependabot cancel merge` will cancel a previously requested merge and block automerging
- `@dependabot reopen` will reopen this PR if it is closed
- `@dependabot close` will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
- `@dependabot show <dependency name> ignore conditions` will show all of the ignore conditions of the specified dependency
- `@dependabot ignore this major version` will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this minor version` will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
- `@dependabot ignore this dependency` will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

</details>

Signed-off-by: dependabot[bot] <[email protected]>
Co-authored-by: dependabot[bot] <49699333+dependabot[bot]@users.noreply.github.com>
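For anyone applying this bump locally, a quick runtime sanity check that the environment actually picked up the pinned version; a minimal sketch using `packaging`, which transformers itself already depends on:

```python
# Sanity-check that the installed transformers matches the bump.
# packaging is already a dependency of transformers, so no new install.
import transformers
from packaging import version

installed = version.parse(transformers.__version__)
assert installed >= version.parse("4.46.1"), f"expected >= 4.46.1, found {installed}"
```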