url stringlengths 62 66 | repository_url stringclasses 1 value | labels_url stringlengths 76 80 | comments_url stringlengths 71 75 | events_url stringlengths 69 73 | html_url stringlengths 50 56 | id int64 377M 2.15B | node_id stringlengths 18 32 | number int64 1 29.2k | title stringlengths 1 487 | user dict | labels list | state stringclasses 2 values | locked bool 2 classes | assignee dict | assignees list | comments list | created_at int64 1.54k 1.71k | updated_at int64 1.54k 1.71k | closed_at int64 1.54k 1.71k ⌀ | author_association stringclasses 4 values | active_lock_reason stringclasses 2 values | body stringlengths 0 234k ⌀ | reactions dict | timeline_url stringlengths 71 75 | state_reason stringclasses 3 values | draft bool 2 classes | pull_request dict
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
https://api.github.com/repos/huggingface/transformers/issues/29161 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29161/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29161/comments | https://api.github.com/repos/huggingface/transformers/issues/29161/events | https://github.com/huggingface/transformers/issues/29161 | 2,145,902,969 | I_kwDOCUB6oc5_5-F5 | 29,161 | To enter token in jupyter notebook issue | {
"login": "arda1906",
"id": 157398066,
"node_id": "U_kgDOCWG0Mg",
"avatar_url": "https://avatars.githubusercontent.com/u/157398066?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/arda1906",
"html_url": "https://github.com/arda1906",
"followers_url": "https://api.github.com/users/arda1906/... | [] | open | false | null | [] | [
"Hi @arda1906, thanks for raising an issue!\r\n\r\nWithout more information about the error i.e. what does it mean to \"not work\" and what is the expected behaviour? we won't be able to help you. \r\n\r\nFrom the snippet, it's not entirely clear how the code is being run, but there are two separate commands which... | 1,708 | 1,708 | null | NONE | null | I run this [from huggingface_hub import notebook_login
notebook_login() ] on cell and enter my token. but it doesn't work:( | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29161/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29161/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29160 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29160/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29160/comments | https://api.github.com/repos/huggingface/transformers/issues/29160/events | https://github.com/huggingface/transformers/pull/29160 | 2,145,779,053 | PR_kwDOCUB6oc5neHY8 | 29,160 | [WIP] add Fusion In Decoder model | {
"login": "oh-gnues-iohc",
"id": 79557937,
"node_id": "MDQ6VXNlcjc5NTU3OTM3",
"avatar_url": "https://avatars.githubusercontent.com/u/79557937?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/oh-gnues-iohc",
"html_url": "https://github.com/oh-gnues-iohc",
"followers_url": "https://api.githu... | [] | open | false | null | [] | [] | 1,708 | 1,708 | null | NONE | null | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this w... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29160/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29160/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29160",
"html_url": "https://github.com/huggingface/transformers/pull/29160",
"diff_url": "https://github.com/huggingface/transformers/pull/29160.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29160.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29159 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29159/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29159/comments | https://api.github.com/repos/huggingface/transformers/issues/29159/events | https://github.com/huggingface/transformers/issues/29159 | 2,145,650,790 | I_kwDOCUB6oc5_5Ahm | 29,159 | [tokenizer] Inconsistent behavior in slow tokenizer and fast tokenizer | {
"login": "Ki-Seki",
"id": 60967965,
"node_id": "MDQ6VXNlcjYwOTY3OTY1",
"avatar_url": "https://avatars.githubusercontent.com/u/60967965?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Ki-Seki",
"html_url": "https://github.com/Ki-Seki",
"followers_url": "https://api.github.com/users/Ki-Sek... | [
{
"id": 2392046359,
"node_id": "MDU6TGFiZWwyMzkyMDQ2MzU5",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Good%20Second%20Issue",
"name": "Good Second Issue",
"color": "dd935a",
"default": false,
"description": "Issues that are more difficult to do than \"Good First... | open | false | null | [] | [
"Hey! Thanks for opening an issue. \r\nFew things first. You are using a custom / local checkpoint with trust remote code. \r\n\r\nFast is not erroring out when you feed OOV, while slow is and it is indeed inconsistent. Would you like to open a PR for a fix? 🤗 ",
"Yes, I'll try that. Thank you for your reply!"
] | 1,708 | 1,708 | null | CONTRIBUTOR | null | ### System Info
- `transformers` version: 4.35.2
- Platform: Linux-5.4.0-163-generic-x86_64-with-glibc2.10
- Python version: 3.8.18
- Huggingface_hub version: 0.19.4
- Safetensors version: 0.4.1
- Accelerate version: not installed
- Accelerate config: not found
- PyTorch version (GPU?): 2.1.1+cu121 (True)
- ... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29159/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29159/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29158 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29158/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29158/comments | https://api.github.com/repos/huggingface/transformers/issues/29158/events | https://github.com/huggingface/transformers/pull/29158 | 2,145,552,337 | PR_kwDOCUB6oc5ndVY6 | 29,158 | [PyTorch/XLA] Fix extra TPU compilations introduced by recent changes | {
"login": "alanwaketan",
"id": 8573935,
"node_id": "MDQ6VXNlcjg1NzM5MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/8573935?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/alanwaketan",
"html_url": "https://github.com/alanwaketan",
"followers_url": "https://api.github.com/us... | [] | open | false | null | [] | [] | 1,708 | 1,708 | null | CONTRIBUTOR | null | # What does this PR do?
This PR tries to fix some extra TPU compilations caused by recent HF changes.
1. PyTorch/XLA doesn't support sdpa yet. So we need to set the default attention implementation to eager.
2. tensor.item() will trigger tpu graph synchronization. We should avoid using it in the training loop.
... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29158/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29158/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29158",
"html_url": "https://github.com/huggingface/transformers/pull/29158",
"diff_url": "https://github.com/huggingface/transformers/pull/29158.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29158.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29157 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29157/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29157/comments | https://api.github.com/repos/huggingface/transformers/issues/29157/events | https://github.com/huggingface/transformers/issues/29157 | 2,145,549,903 | I_kwDOCUB6oc5_4n5P | 29,157 | Error while saving with EarlyStoppingCallback | {
"login": "dhruvmullick",
"id": 7004024,
"node_id": "MDQ6VXNlcjcwMDQwMjQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/7004024?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dhruvmullick",
"html_url": "https://github.com/dhruvmullick",
"followers_url": "https://api.github.com... | [] | open | false | null | [] | [] | 1,708 | 1,708 | null | NONE | null | ### System Info
- `transformers` version: 4.38.0.dev0
- Platform: Linux-5.15.0-78-generic-x86_64-with-glibc2.35
- Python version: 3.10.12
- Huggingface_hub version: 0.20.3
- Safetensors version: 0.4.2
- Accelerate version: 0.28.0.dev0
- Accelerate config: not found
- PyTorch version (GPU?): 2.1.2+cu121 (True... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29157/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29157/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29156 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29156/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29156/comments | https://api.github.com/repos/huggingface/transformers/issues/29156/events | https://github.com/huggingface/transformers/pull/29156 | 2,145,522,407 | PR_kwDOCUB6oc5ndO3J | 29,156 | Making extensible | {
"login": "ddevaul",
"id": 71190628,
"node_id": "MDQ6VXNlcjcxMTkwNjI4",
"avatar_url": "https://avatars.githubusercontent.com/u/71190628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ddevaul",
"html_url": "https://github.com/ddevaul",
"followers_url": "https://api.github.com/users/ddevau... | [] | open | false | null | [] | [
"Hi @ddevaul, what is the purpose of this PR? \r\n"
] | 1,708 | 1,708 | null | NONE | null | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this w... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29156/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29156/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29156",
"html_url": "https://github.com/huggingface/transformers/pull/29156",
"diff_url": "https://github.com/huggingface/transformers/pull/29156.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29156.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29155 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29155/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29155/comments | https://api.github.com/repos/huggingface/transformers/issues/29155/events | https://github.com/huggingface/transformers/issues/29155 | 2,145,382,760 | I_kwDOCUB6oc5_3_Fo | 29,155 | PyTest import error | {
"login": "loadams",
"id": 114770087,
"node_id": "U_kgDOBtdApw",
"avatar_url": "https://avatars.githubusercontent.com/u/114770087?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/loadams",
"html_url": "https://github.com/loadams",
"followers_url": "https://api.github.com/users/loadams/foll... | [] | open | false | null | [] | [] | 1,708 | 1,708 | null | NONE | null | ### System Info
Current head of transformers shows this issue, when importing functions from pytest, the `import_path` function is not found. Sample error from DeepSpeed's unit tests [here](https://github.com/microsoft/DeepSpeed/actions/runs/7977730884/job/21781270161?pr=5164#step:7:391).
```
______________ ERROR... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29155/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29155/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29154 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29154/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29154/comments | https://api.github.com/repos/huggingface/transformers/issues/29154/events | https://github.com/huggingface/transformers/pull/29154 | 2,145,294,779 | PR_kwDOCUB6oc5nccpR | 29,154 | Update pytest `import_path` location | {
"login": "loadams",
"id": 114770087,
"node_id": "U_kgDOBtdApw",
"avatar_url": "https://avatars.githubusercontent.com/u/114770087?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/loadams",
"html_url": "https://github.com/loadams",
"followers_url": "https://api.github.com/users/loadams/foll... | [] | open | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29154). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | null | NONE | null | # What does this PR do?
Fixes location of `import_path` from pytest from `_pytest.doctest` to `_pytest.pathlib` when using PyTest 8.0.1+ since it is finally deprecated from being in `_pytest.doctest`. It is provided in `_pytest.pathlib` from at least 7.2.0+ so we do not need to modify the supported pytest range in ... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29154/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29154/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29154",
"html_url": "https://github.com/huggingface/transformers/pull/29154",
"diff_url": "https://github.com/huggingface/transformers/pull/29154.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29154.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29153 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29153/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29153/comments | https://api.github.com/repos/huggingface/transformers/issues/29153/events | https://github.com/huggingface/transformers/issues/29153 | 2,145,101,851 | I_kwDOCUB6oc5_26gb | 29,153 | Plans to add DoRA? | {
"login": "RonanKMcGovern",
"id": 78278410,
"node_id": "MDQ6VXNlcjc4Mjc4NDEw",
"avatar_url": "https://avatars.githubusercontent.com/u/78278410?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/RonanKMcGovern",
"html_url": "https://github.com/RonanKMcGovern",
"followers_url": "https://api.gi... | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | open | false | null | [] | [
"cc @younesbelkada @pacman100 ",
"Hi @RonanKMcGovern ! \r\nThanks for the feature request! There is already an ongoing work from @BenjaminBossan to add DoRA in PEFT: https://github.com/huggingface/peft/pull/1474",
"Closing as there is a PR underway.",
"OK thank you @RonanKMcGovern !"
] | 1,708 | 1,708 | null | NONE | null | ### Feature request
Improves on LoRA by allowing magnitude fine-tuning.
### Motivation
Improved perplexity.
### Your contribution
Sebastien Bubeck has published demo code. https://github.com/rasbt/dora-from-scratch | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29153/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29153/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29152 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29152/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29152/comments | https://api.github.com/repos/huggingface/transformers/issues/29152/events | https://github.com/huggingface/transformers/pull/29152 | 2,145,071,699 | PR_kwDOCUB6oc5nbr5K | 29,152 | Alternative approach | {
"login": "amyeroberts",
"id": 22614925,
"node_id": "MDQ6VXNlcjIyNjE0OTI1",
"avatar_url": "https://avatars.githubusercontent.com/u/22614925?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/amyeroberts",
"html_url": "https://github.com/amyeroberts",
"followers_url": "https://api.github.com/... | [] | open | false | null | [] | [
"cc @Rocketknight1 ",
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29152). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | null | COLLABORATOR | null | # What does this PR do?
Alternative way to use stop words for generated sequences. Note - it doesn't
<details>
<summary>Script</summary>
```py
import time
import numpy as np
from transformers.generation.stopping_criteria import StopStringCriteria, StopStringCriteria2
from transformers import AutoToke... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29152/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29152/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29152",
"html_url": "https://github.com/huggingface/transformers/pull/29152",
"diff_url": "https://github.com/huggingface/transformers/pull/29152.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29152.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29151 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29151/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29151/comments | https://api.github.com/repos/huggingface/transformers/issues/29151/events | https://github.com/huggingface/transformers/issues/29151 | 2,145,069,207 | I_kwDOCUB6oc5_2yiX | 29,151 | Static cache + torch.compile: support prefill static sequence length | {
"login": "fxmarty",
"id": 9808326,
"node_id": "MDQ6VXNlcjk4MDgzMjY=",
"avatar_url": "https://avatars.githubusercontent.com/u/9808326?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/fxmarty",
"html_url": "https://github.com/fxmarty",
"followers_url": "https://api.github.com/users/fxmarty/... | [] | open | false | null | [] | [
"cc @gante ",
"@fxmarty this is the same problem as we have in TF and Flax. There, we nudged users to use the `pad_to_multiple_of` argument in the tokenizer, which I believe solves the problem 🤗 \r\n\r\nHow do you suggest us to let users know about this feature, other than docs?"
] | 1,708 | 1,708 | null | COLLABORATOR | null | ### Feature request
When using torch.compile, the prefill is recompiled for every new sequence length, which is slow. It may be nice to be able to compile only say for some sequence lengths (`1, 2, 4, 16, 32, 64, 128, 256, 512, 1024, 2048, 4096, etc`) on the fly depending on the input lengths, using some padding.
###... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29151/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29151/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29150 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29150/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29150/comments | https://api.github.com/repos/huggingface/transformers/issues/29150/events | https://github.com/huggingface/transformers/issues/29150 | 2,144,941,834 | I_kwDOCUB6oc5_2TcK | 29,150 | Difficulty in adding custom model | {
"login": "El-chapo-007",
"id": 125077963,
"node_id": "U_kgDOB3SJyw",
"avatar_url": "https://avatars.githubusercontent.com/u/125077963?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/El-chapo-007",
"html_url": "https://github.com/El-chapo-007",
"followers_url": "https://api.github.com/use... | [] | open | false | null | [] | [
"Hi @El-chapo-007, thanks for opening this issue! \r\n\r\nGlad to hear that your journey has been mostly successful 🤗 \r\n\r\nHave you seen our documentation page about adding custom models? This should contain all the info and example code needed to get started: https://huggingface.co/docs/transformers/custom_mod... | 1,708 | 1,708 | null | NONE | null | ### Feature request
Hi
Hope all the team members of hugging face are well
I am a student and currently doing work on nlp projects , although most of my journey was successful because well documented information for starters especially example notebooks but what part is confusing and difficult is to upload and cr... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29150/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29150/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29149 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29149/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29149/comments | https://api.github.com/repos/huggingface/transformers/issues/29149/events | https://github.com/huggingface/transformers/issues/29149 | 2,144,914,235 | I_kwDOCUB6oc5_2Ms7 | 29,149 | Generate: support passing position_ids | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | [] | open | false | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | [
{
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.co... | [
"@zucchini-nlp FYI. We shouldn't fix this now, as it requires significant manual labor to update all models. After the static cache sprint we should have a look at this :)"
] | 1,708 | 1,708 | null | MEMBER | null | Thank you @tengomucho, for uncovering this bug.
### The problem
In a nutshell, passing the correct `position_ids` to `generate` should result in exactly the same results as not passing them. In other words, the following test should pass on all models, if added to `GenerationTesterMixin`. We can see that it is fa... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29149/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29149/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29148 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29148/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29148/comments | https://api.github.com/repos/huggingface/transformers/issues/29148/events | https://github.com/huggingface/transformers/pull/29148 | 2,144,911,415 | PR_kwDOCUB6oc5nbILV | 29,148 | Token level timestamps for long-form generation in Whisper | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/use... | [] | open | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29148). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | null | MEMBER | null | # What does this PR do?
Continuation of PR #28984. Adds token level timestamps for long-form generation. The previous PR had a quite different of way to add timestamps, specifically by calling `extract_timestamps` for each segment and each batch separately. I believe, it can be done in one batch, and then divided in... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29148/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29148/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29148",
"html_url": "https://github.com/huggingface/transformers/pull/29148",
"diff_url": "https://github.com/huggingface/transformers/pull/29148.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29148.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29147 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29147/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29147/comments | https://api.github.com/repos/huggingface/transformers/issues/29147/events | https://github.com/huggingface/transformers/pull/29147 | 2,144,785,389 | PR_kwDOCUB6oc5nasd- | 29,147 | Fix drop path being ignored in DINOv2 | {
"login": "fepegar",
"id": 12688084,
"node_id": "MDQ6VXNlcjEyNjg4MDg0",
"avatar_url": "https://avatars.githubusercontent.com/u/12688084?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/fepegar",
"html_url": "https://github.com/fepegar",
"followers_url": "https://api.github.com/users/fepega... | [] | closed | false | null | [] | [
"Thanks for reviewing, @amyeroberts!"
] | 1,708 | 1,708 | 1,708 | CONTRIBUTOR | null | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this w... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29147/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29147/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29147",
"html_url": "https://github.com/huggingface/transformers/pull/29147",
"diff_url": "https://github.com/huggingface/transformers/pull/29147.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29147.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29146 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29146/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29146/comments | https://api.github.com/repos/huggingface/transformers/issues/29146/events | https://github.com/huggingface/transformers/pull/29146 | 2,144,586,510 | PR_kwDOCUB6oc5naAbp | 29,146 | Generate: missing generation config eos token setting in encoder-decoder tests | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29146). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | 1,708 | MEMBER | null | # What does this PR do?
These tests were failing with low likelihood, all for the same reason as fixed in [this recent PR](https://github.com/huggingface/transformers/pull/28923): there should be no EOS token to enable endless generation, but the generation config still had the default value.
I couldn't find more... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29146/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29146/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29146",
"html_url": "https://github.com/huggingface/transformers/pull/29146",
"diff_url": "https://github.com/huggingface/transformers/pull/29146.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29146.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29145 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29145/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29145/comments | https://api.github.com/repos/huggingface/transformers/issues/29145/events | https://github.com/huggingface/transformers/issues/29145 | 2,144,556,865 | I_kwDOCUB6oc5_01dB | 29,145 | AI2 Olmo 7B does not support Flash-Attention 2.0. ValueError: OLMoForCausalLM does not support Flash Attention 2.0 yet. | {
"login": "KaifAhmad1",
"id": 98801504,
"node_id": "U_kgDOBeOXYA",
"avatar_url": "https://avatars.githubusercontent.com/u/98801504?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/KaifAhmad1",
"html_url": "https://github.com/KaifAhmad1",
"followers_url": "https://api.github.com/users/KaifA... | [
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
}
] | closed | false | null | [] | [] | 1,708 | 1,708 | 1,708 | NONE | null | ### Model description
Model Name: allenai/OLMo-7B
### Open source status
- [X] The model implementation is available
- [X] The model weights are available
### Provide useful links for the implementation
_No response_ | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29145/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29145/timeline | completed | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29144 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29144/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29144/comments | https://api.github.com/repos/huggingface/transformers/issues/29144/events | https://github.com/huggingface/transformers/pull/29144 | 2,144,483,260 | PR_kwDOCUB6oc5nZpun | 29,144 | bug-fix: avoid 'Expected all tensors to be on the same device' error when doing multi-GPU training | {
"login": "kallewoof",
"id": 250224,
"node_id": "MDQ6VXNlcjI1MDIyNA==",
"avatar_url": "https://avatars.githubusercontent.com/u/250224?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kallewoof",
"html_url": "https://github.com/kallewoof",
"followers_url": "https://api.github.com/users/kall... | [] | open | false | null | [] | [] | 1,708 | 1,708 | null | NONE | null | When doing DPO training, if the model has been split over multiple GPUs, the `tr_loss` and the `tr_loss_step` end up on different devices at some point, resulting in a
```
Expected all tensors to be on the same device, but found at least two devices, cuda:0 and cuda:1
```
error. This patch makes an explicit copy... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29144/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29144/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29144",
"html_url": "https://github.com/huggingface/transformers/pull/29144",
"diff_url": "https://github.com/huggingface/transformers/pull/29144.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29144.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29143 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29143/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29143/comments | https://api.github.com/repos/huggingface/transformers/issues/29143/events | https://github.com/huggingface/transformers/pull/29143 | 2,144,476,455 | PR_kwDOCUB6oc5nZoPN | 29,143 | Llama: update rope scaling to match static cache changes | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | [] | open | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29143). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | null | MEMBER | null | # What does this PR do?
(see title :))
Review suggestion:
1. Review changes in Llama
2. Review the rest | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29143/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29143/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29143",
"html_url": "https://github.com/huggingface/transformers/pull/29143",
"diff_url": "https://github.com/huggingface/transformers/pull/29143.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29143.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29142 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29142/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29142/comments | https://api.github.com/repos/huggingface/transformers/issues/29142/events | https://github.com/huggingface/transformers/pull/29142 | 2,144,430,707 | PR_kwDOCUB6oc5nZeOR | 29,142 | Add training version check for AQLM quantizer. | {
"login": "BlackSamorez",
"id": 16901341,
"node_id": "MDQ6VXNlcjE2OTAxMzQx",
"avatar_url": "https://avatars.githubusercontent.com/u/16901341?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BlackSamorez",
"html_url": "https://github.com/BlackSamorez",
"followers_url": "https://api.github.c... | [] | open | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29142). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | null | CONTRIBUTOR | null | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this w... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29142/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29142/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29142",
"html_url": "https://github.com/huggingface/transformers/pull/29142",
"diff_url": "https://github.com/huggingface/transformers/pull/29142.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29142.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29141 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29141/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29141/comments | https://api.github.com/repos/huggingface/transformers/issues/29141/events | https://github.com/huggingface/transformers/pull/29141 | 2,144,232,619 | PR_kwDOCUB6oc5nYyzq | 29,141 | Save (circleci) cache at the end of a job | {
"login": "ydshieh",
"id": 2521628,
"node_id": "MDQ6VXNlcjI1MjE2Mjg=",
"avatar_url": "https://avatars.githubusercontent.com/u/2521628?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ydshieh",
"html_url": "https://github.com/ydshieh",
"followers_url": "https://api.github.com/users/ydshieh/... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29141). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | 1,708 | COLLABORATOR | null | # What does this PR do?
This way, `pytest` will run before `cache saving` and we have access to the results earlier in the case of partial or no cache loaded. | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29141/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29141/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29141",
"html_url": "https://github.com/huggingface/transformers/pull/29141",
"diff_url": "https://github.com/huggingface/transformers/pull/29141.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29141.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29140 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29140/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29140/comments | https://api.github.com/repos/huggingface/transformers/issues/29140/events | https://github.com/huggingface/transformers/issues/29140 | 2,144,160,231 | I_kwDOCUB6oc5_zUnn | 29,140 | Drop path is ignored in DINOv2 | {
"login": "fepegar",
"id": 12688084,
"node_id": "MDQ6VXNlcjEyNjg4MDg0",
"avatar_url": "https://avatars.githubusercontent.com/u/12688084?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/fepegar",
"html_url": "https://github.com/fepegar",
"followers_url": "https://api.github.com/users/fepega... | [] | closed | false | null | [] | [
"Hey, thanks for the issue! I've checked out your branch, from what I'm seeing tests are passing on your fix, would you mind opening a PR? \r\nAlso, since this will affect training, do you have a script that compares both in a training scenario? AFAIK current integration tests for Dinov2 are not in a training setti... | 1,708 | 1,708 | 1,708 | CONTRIBUTOR | null | ### System Info
- `transformers` version: 4.38.0.dev0
- Platform: Linux-5.15.0-91-generic-x86_64-with-glibc2.31
- Python version: 3.11.7
- Huggingface_hub version: 0.20.3
- Safetensors version: 0.4.2
- Accelerate version: not installed
- Accelerate config: not found
- PyTorch version (GPU?): 2.2.0 (True)
- Ten... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29140/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29140/timeline | completed | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29139 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29139/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29139/comments | https://api.github.com/repos/huggingface/transformers/issues/29139/events | https://github.com/huggingface/transformers/issues/29139 | 2,144,132,992 | I_kwDOCUB6oc5_zN-A | 29,139 | past_key_values for SeamlessM4Tv2ForSpeechToText is not working as expected | {
"login": "vapemaster-kz",
"id": 65128133,
"node_id": "MDQ6VXNlcjY1MTI4MTMz",
"avatar_url": "https://avatars.githubusercontent.com/u/65128133?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vapemaster-kz",
"html_url": "https://github.com/vapemaster-kz",
"followers_url": "https://api.githu... | [] | open | false | null | [] | [
"cc @ylacombe "
] | 1,708 | 1,708 | null | NONE | null | ### System Info
transformers version: 4.37.2
python version: 3.8.6.
OS: Windows 11
### Who can help?
@sanchit-gandhi
### Information
- [ ] The official example scripts
- [X] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [X] My own task... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29139/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29139/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29138 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29138/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29138/comments | https://api.github.com/repos/huggingface/transformers/issues/29138/events | https://github.com/huggingface/transformers/pull/29138 | 2,144,115,768 | PR_kwDOCUB6oc5nYZN3 | 29,138 | Fix ROPE embeddings for LLama | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/use... | [] | closed | false | null | [] | [] | 1,708 | 1,708 | 1,708 | MEMBER | null | # What does this PR do?
This [test](https://app.circleci.com/pipelines/github/huggingface/transformers/84847/workflows/2a5e5769-9431-4e2b-babb-81a112558a97/jobs/1098065) failed on my PR and I checked to see the reason. I found that the changes introduced to make llama compile compatible are causing the issue.
Th... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29138/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29138/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29138",
"html_url": "https://github.com/huggingface/transformers/pull/29138",
"diff_url": "https://github.com/huggingface/transformers/pull/29138.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29138.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29137 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29137/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29137/comments | https://api.github.com/repos/huggingface/transformers/issues/29137/events | https://github.com/huggingface/transformers/issues/29137 | 2,144,069,859 | I_kwDOCUB6oc5_y-jj | 29,137 | transformers.AutoTokenizer.from_pretrained( ... use_Fast=False) fails with 'TypeError: not a string' for some tokenizers | {
"login": "Jeronymous",
"id": 22522728,
"node_id": "MDQ6VXNlcjIyNTIyNzI4",
"avatar_url": "https://avatars.githubusercontent.com/u/22522728?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Jeronymous",
"html_url": "https://github.com/Jeronymous",
"followers_url": "https://api.github.com/use... | [] | closed | false | null | [] | [
"cc @ArthurZucker ",
"Hey! Thanks for reporting. \r\n`tokenizer.Load(self.vocab_file)` seems to be the issue here. If you check the repo it does not have the `tokenizer.model` .\r\nYou should raise the issue there! \r\n",
"Thanks @ArthurZucker 👍 "
] | 1,708 | 1,708 | 1,708 | NONE | null | ### System Info
- `transformers` version: 4.37.2
- Platform: Linux-5.15.133.1-microsoft-standard-WSL2-x86_64-with-glibc2.35
- Python version: 3.10.12
- Huggingface_hub version: 0.19.4
- Safetensors version: 0.4.1
- Accelerate version: not installed
- Accelerate config: not found
- PyTorch version (GPU?... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29137/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29137/timeline | completed | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29136 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29136/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29136/comments | https://api.github.com/repos/huggingface/transformers/issues/29136/events | https://github.com/huggingface/transformers/pull/29136 | 2,144,048,828 | PR_kwDOCUB6oc5nYKjd | 29,136 | Generate: low memory tests are flaky | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | [] | open | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29136). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"@amyeroberts #29109 seems to have fixed most of the issue (this test does compare ... | 1,708 | 1,708 | null | MEMBER | null | # What does this PR do?
As identified by @molbap -- generate tests with the `low_memory` flag are flaky. The full reason is the same as explained in [this comment](https://github.com/huggingface/transformers/issues/25420#issuecomment-1775317535).
The error likelihood is low (~3%), but still quite disruptive for ... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29136/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29136/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29136",
"html_url": "https://github.com/huggingface/transformers/pull/29136",
"diff_url": "https://github.com/huggingface/transformers/pull/29136.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29136.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29135 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29135/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29135/comments | https://api.github.com/repos/huggingface/transformers/issues/29135/events | https://github.com/huggingface/transformers/pull/29135 | 2,144,037,386 | PR_kwDOCUB6oc5nYICS | 29,135 | Revert low cpu mem tie weights | {
"login": "amyeroberts",
"id": 22614925,
"node_id": "MDQ6VXNlcjIyNjE0OTI1",
"avatar_url": "https://avatars.githubusercontent.com/u/22614925?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/amyeroberts",
"html_url": "https://github.com/amyeroberts",
"followers_url": "https://api.github.com/... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29135). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"Sounds good, thanks for taking care of this!"
] | 1,708 | 1,708 | 1,708 | COLLABORATOR | null | # What does this PR do?
Reverts #28948 and #29043
See relevant comment: https://github.com/huggingface/transformers/pull/29110#issuecomment-1953847826
cc @hackyon @ydshieh
| {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29135/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29135/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29135",
"html_url": "https://github.com/huggingface/transformers/pull/29135",
"diff_url": "https://github.com/huggingface/transformers/pull/29135.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29135.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29134 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29134/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29134/comments | https://api.github.com/repos/huggingface/transformers/issues/29134/events | https://github.com/huggingface/transformers/pull/29134 | 2,143,960,967 | PR_kwDOCUB6oc5nX3V4 | 29,134 | Add generate kwargs to VQA pipeline | {
"login": "regisss",
"id": 15324346,
"node_id": "MDQ6VXNlcjE1MzI0MzQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/15324346?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/regisss",
"html_url": "https://github.com/regisss",
"followers_url": "https://api.github.com/users/regiss... | [] | open | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29134). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | null | CONTRIBUTOR | null | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this w... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29134/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29134/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29134",
"html_url": "https://github.com/huggingface/transformers/pull/29134",
"diff_url": "https://github.com/huggingface/transformers/pull/29134.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29134.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29133 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29133/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29133/comments | https://api.github.com/repos/huggingface/transformers/issues/29133/events | https://github.com/huggingface/transformers/pull/29133 | 2,143,951,741 | PR_kwDOCUB6oc5nX1Va | 29,133 | [`cuda kernels`] only compile them when initializing | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.c... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29133). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"I'll make sure of that before merging! Testing now!",
"```bash\r\nFAILED tests/m... | 1,708 | 1,708 | 1,708 | COLLABORATOR | null | # What does this PR do?
Fixes #29130, from 1 min to 6 seconds | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29133/reactions",
"total_count": 3,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 3,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29133/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29133",
"html_url": "https://github.com/huggingface/transformers/pull/29133",
"diff_url": "https://github.com/huggingface/transformers/pull/29133.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29133.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29132 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29132/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29132/comments | https://api.github.com/repos/huggingface/transformers/issues/29132/events | https://github.com/huggingface/transformers/issues/29132 | 2,143,872,350 | I_kwDOCUB6oc5_yOVe | 29,132 | SPAM | {
"login": "cook9019",
"id": 141466977,
"node_id": "U_kgDOCG6dYQ",
"avatar_url": "https://avatars.githubusercontent.com/u/141466977?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/cook9019",
"html_url": "https://github.com/cook9019",
"followers_url": "https://api.github.com/users/cook9019/... | [] | closed | false | null | [] | [] | 1,708 | 1,708 | 1,708 | NONE | null | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29132/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29132/timeline | not_planned | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29131 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29131/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29131/comments | https://api.github.com/repos/huggingface/transformers/issues/29131/events | https://github.com/huggingface/transformers/pull/29131 | 2,143,812,725 | PR_kwDOCUB6oc5nXWfA | 29,131 | added the max_matching_ngram_size to GenerationConfig | {
"login": "mosheber",
"id": 22236370,
"node_id": "MDQ6VXNlcjIyMjM2Mzcw",
"avatar_url": "https://avatars.githubusercontent.com/u/22236370?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/mosheber",
"html_url": "https://github.com/mosheber",
"followers_url": "https://api.github.com/users/mos... | [] | open | false | null | [] | [] | 1,708 | 1,708 | null | CONTRIBUTOR | null | # What does this PR do?
* Added the max_matching_ngram_size parameter into the GenerationConfig, for the PromptLookupCandidateGenerator.
* Included the max_matching_ngram_size when calling the __init__ of PromptLookupCandidateGenerator in _get_candidate_generator, in case it is specified.
## Who can review?
... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29131/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29131/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29131",
"html_url": "https://github.com/huggingface/transformers/pull/29131",
"diff_url": "https://github.com/huggingface/transformers/pull/29131.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29131.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29130 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29130/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29130/comments | https://api.github.com/repos/huggingface/transformers/issues/29130/events | https://github.com/huggingface/transformers/issues/29130 | 2,143,788,296 | I_kwDOCUB6oc5_x50I | 29,130 | Move kernel compilation to init rather than at import stage | {
"login": "NielsRogge",
"id": 48327001,
"node_id": "MDQ6VXNlcjQ4MzI3MDAx",
"avatar_url": "https://avatars.githubusercontent.com/u/48327001?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/NielsRogge",
"html_url": "https://github.com/NielsRogge",
"followers_url": "https://api.github.com/use... | [
{
"id": 1862634478,
"node_id": "MDU6TGFiZWwxODYyNjM0NDc4",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Should%20Fix",
"name": "Should Fix",
"color": "FF0000",
"default": false,
"description": "This has been identified as a bug and should be fixed."
},
{
"... | closed | false | null | [] | [] | 1,708 | 1,708 | 1,708 | CONTRIBUTOR | null | ### Feature request
Some models like Deformable DETR rely on custom CUDA kernels to be compiled as seen [here](https://github.com/huggingface/transformers/blob/f7ef7cec6c6c162087421f36a17eabdbb223579d/src/transformers/models/deformable_detr/modeling_deformable_detr.py#L54).
Currently these are compiled when importi... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29130/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29130/timeline | completed | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29129 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29129/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29129/comments | https://api.github.com/repos/huggingface/transformers/issues/29129/events | https://github.com/huggingface/transformers/issues/29129 | 2,143,773,084 | I_kwDOCUB6oc5_x2Gc | 29,129 | Flash attention implementation with BERT base model | {
"login": "ghost",
"id": 10137,
"node_id": "MDQ6VXNlcjEwMTM3",
"avatar_url": "https://avatars.githubusercontent.com/u/10137?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ghost",
"html_url": "https://github.com/ghost",
"followers_url": "https://api.github.com/users/ghost/followers",
"f... | [
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
}
] | open | false | null | [] | [
"Not that expert but I suggest you can try bettertransformer for extreme speed up. ( In my knowledge that flash-attn is mainly focused on kv cache which is not exist on Bert-like model in most cases. )",
"> Not that expert but I suggest you can try bettertransformer for extreme speed up. ( In my knowledge that fl... | 1,708 | 1,708 | null | NONE | null | ### Model description
hello and thanks community.
I am trying to replace standard attention by flash attention in the BERT base Model. Anyone please help not able to find any tutorial or any discussions.
or just give some directions how to do that ..I have got the idea of making attention prob drop prob = 0 . it m... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29129/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29129/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29128 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29128/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29128/comments | https://api.github.com/repos/huggingface/transformers/issues/29128/events | https://github.com/huggingface/transformers/issues/29128 | 2,143,692,799 | I_kwDOCUB6oc5_xif_ | 29,128 | bart-large-xsum model: There were missing keys in the checkpoint model loaded: ['model.encoder.embed_tokens.weight', 'model.decoder.embed_tokens.weight', 'lm_head.weight']. | {
"login": "Aisuko",
"id": 8053949,
"node_id": "MDQ6VXNlcjgwNTM5NDk=",
"avatar_url": "https://avatars.githubusercontent.com/u/8053949?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Aisuko",
"html_url": "https://github.com/Aisuko",
"followers_url": "https://api.github.com/users/Aisuko/foll... | [] | open | false | null | [] | [
"cc @ArthurZucker @younesbelkada ",
"Hey @Aisuko, could you provide a **minimal** reproducer ? That would help use! \r\nAlso note that the `generation parameters` issues can probably be safely ignored. The missing keys is however a bit more problematic! \r\nMight be tied weights that are not tied properly, is `ti... | 1,708 | 1,708 | null | NONE | null | ### System Info
- `transformers` version: 4.37.2
- Platform: Linux-5.15.133+-x86_64-with-glibc2.31
- Python version: 3.10.13
- Huggingface_hub version: 0.20.3
- Safetensors version: 0.4.2
- Accelerate version: 0.26.1
- Accelerate config: not found
- PyTorch version (GPU?): 2.1.2 (True)
- Tensorflow version ... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29128/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29128/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29127 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29127/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29127/comments | https://api.github.com/repos/huggingface/transformers/issues/29127/events | https://github.com/huggingface/transformers/issues/29127 | 2,143,620,996 | I_kwDOCUB6oc5_xQ-E | 29,127 | err_handle(layoutlmv3): Error message doesn't give much clarity when boxes not containing enough information | {
"login": "Sushaanth-Suresh-Kumar",
"id": 123300765,
"node_id": "U_kgDOB1lrnQ",
"avatar_url": "https://avatars.githubusercontent.com/u/123300765?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Sushaanth-Suresh-Kumar",
"html_url": "https://github.com/Sushaanth-Suresh-Kumar",
"followers_url... | [] | open | false | null | [] | [
"Would you like to open a PR to improve the error? 🤗 ",
"Sure"
] | 1,708 | 1,708 | null | NONE | null | ### System Info
- `transformers` version: 4.37.2
- Platform: Windows-10-10.0.22000-SP0
- Python version: 3.11.5
- Huggingface_hub version: 0.20.3
- Safetensors version: 0.4.2
- Accelerate version: not installed
- Accelerate config: not found
- PyTorch version (GPU?): 2.2.0+cpu (False)
- Tensorflow version (G... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29127/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29127/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29126 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29126/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29126/comments | https://api.github.com/repos/huggingface/transformers/issues/29126/events | https://github.com/huggingface/transformers/issues/29126 | 2,143,539,045 | I_kwDOCUB6oc5_w89l | 29,126 | WARNING: tokenization mismatch: 43 vs. 44. (ignored) | {
"login": "lucasjinreal",
"id": 21303438,
"node_id": "MDQ6VXNlcjIxMzAzNDM4",
"avatar_url": "https://avatars.githubusercontent.com/u/21303438?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lucasjinreal",
"html_url": "https://github.com/lucasjinreal",
"followers_url": "https://api.github.c... | [] | open | false | null | [] | [
"Hi @lucasjinreal, \r\n\r\nWithout a code sample to replicate, information about the running environment or more information about the error - including full trackback - there isn't much we can do to help you here."
] | 1,708 | 1,708 | null | NONE | null | Recently there are many errors got either from fastchat or llava code base if using latest transfomers.
WARNING: tokenization mismatch: 43 vs. 44. (ignored)
What does this happen and how to dismiss it? Will it effect the final training result? | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29126/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29126/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29125 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29125/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29125/comments | https://api.github.com/repos/huggingface/transformers/issues/29125/events | https://github.com/huggingface/transformers/pull/29125 | 2,143,504,797 | PR_kwDOCUB6oc5nWUBE | 29,125 | feat: Upgrade Weights & Biases callback | {
"login": "parambharat",
"id": 12809212,
"node_id": "MDQ6VXNlcjEyODA5MjEy",
"avatar_url": "https://avatars.githubusercontent.com/u/12809212?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/parambharat",
"html_url": "https://github.com/parambharat",
"followers_url": "https://api.github.com/... | [] | open | false | null | [] | [] | 1,708 | 1,708 | null | CONTRIBUTOR | null | # What does this PR do?
This PR adds a few new functionalities to the Weights & Biases Callback
- Logs Peft and Lora Config to wandb if present
- Adds model parameter counts to wandb config and artifact metadata
- Adds on_predict methods to log prediction metrics
- Prints the model architecture to a file alongsi... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29125/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29125/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29125",
"html_url": "https://github.com/huggingface/transformers/pull/29125",
"diff_url": "https://github.com/huggingface/transformers/pull/29125.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29125.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29124 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29124/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29124/comments | https://api.github.com/repos/huggingface/transformers/issues/29124/events | https://github.com/huggingface/transformers/pull/29124 | 2,143,420,111 | PR_kwDOCUB6oc5nWBoW | 29,124 | added unrolled whisper_generation.py | {
"login": "robertgshaw2-neuralmagic",
"id": 114415538,
"node_id": "U_kgDOBtHXsg",
"avatar_url": "https://avatars.githubusercontent.com/u/114415538?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/robertgshaw2-neuralmagic",
"html_url": "https://github.com/robertgshaw2-neuralmagic",
"followe... | [] | closed | false | null | [] | [] | 1,708 | 1,708 | 1,708 | NONE | null | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29124/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29124/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29124",
"html_url": "https://github.com/huggingface/transformers/pull/29124",
"diff_url": "https://github.com/huggingface/transformers/pull/29124.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29124.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29123 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29123/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29123/comments | https://api.github.com/repos/huggingface/transformers/issues/29123/events | https://github.com/huggingface/transformers/pull/29123 | 2,143,416,822 | PR_kwDOCUB6oc5nWA8d | 29,123 | [`Core generation`] Let's be less restrictive on the arguments passed to the generation calls. | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.c... | [] | open | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29123). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | null | COLLABORATOR | null | # What does this PR do?
Updates generate calls | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29123/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29123/timeline | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29123",
"html_url": "https://github.com/huggingface/transformers/pull/29123",
"diff_url": "https://github.com/huggingface/transformers/pull/29123.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29123.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29122 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29122/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29122/comments | https://api.github.com/repos/huggingface/transformers/issues/29122/events | https://github.com/huggingface/transformers/pull/29122 | 2,143,413,555 | PR_kwDOCUB6oc5nWARN | 29,122 | FIX [`bnb` / `tests`] Propagate the changes from #29092 to 4-bit tests | {
"login": "younesbelkada",
"id": 49240599,
"node_id": "MDQ6VXNlcjQ5MjQwNTk5",
"avatar_url": "https://avatars.githubusercontent.com/u/49240599?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/younesbelkada",
"html_url": "https://github.com/younesbelkada",
"followers_url": "https://api.githu... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29122). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | 1,708 | CONTRIBUTOR | null | # What does this PR do?
As per title, I overlooked the fix and forgot to push the changes of https://github.com/huggingface/transformers/pull/29092 in 4-bit tests 😢
cc @amyeroberts @Titus-von-Koeller | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29122/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29122/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29122",
"html_url": "https://github.com/huggingface/transformers/pull/29122",
"diff_url": "https://github.com/huggingface/transformers/pull/29122.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29122.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29121 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29121/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29121/comments | https://api.github.com/repos/huggingface/transformers/issues/29121/events | https://github.com/huggingface/transformers/issues/29121 | 2,143,187,142 | I_kwDOCUB6oc5_vnDG | 29,121 | AttributeError: 'DistilBertModel' object has no attribute '_use_flash_attention_2' | {
"login": "javilonso",
"id": 31996659,
"node_id": "MDQ6VXNlcjMxOTk2NjU5",
"avatar_url": "https://avatars.githubusercontent.com/u/31996659?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/javilonso",
"html_url": "https://github.com/javilonso",
"followers_url": "https://api.github.com/users/... | [] | open | false | null | [] | [
"Hi @javilonso ! \r\nI quickly tried on transformers main: \r\n```python\r\nfrom transformers import pipeline\r\n\r\nunmasker = pipeline('fill-mask', model='distilbert-base-uncased')\r\nunmasker(\"Hello I'm a [MASK] model.\")\r\n```\r\nBut I did not managed to repro, can you share a snippet to reproduce the issue?\... | 1,708 | 1,708 | null | NONE | null | ### System Info
Obtaining this error in last transformers 4.37.2, but works correctly in transformers 4.35.2
Simple inference with a finetuned distilbert model.
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supporte... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29121/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29121/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29120 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29120/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29120/comments | https://api.github.com/repos/huggingface/transformers/issues/29120/events | https://github.com/huggingface/transformers/pull/29120 | 2,143,042,742 | PR_kwDOCUB6oc5nUwcG | 29,120 | Starcoder2 model | {
"login": "jlamypoirier",
"id": 18523627,
"node_id": "MDQ6VXNlcjE4NTIzNjI3",
"avatar_url": "https://avatars.githubusercontent.com/u/18523627?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jlamypoirier",
"html_url": "https://github.com/jlamypoirier",
"followers_url": "https://api.github.c... | [] | open | false | null | [] | [] | 1,708 | 1,708 | null | CONTRIBUTOR | null | The Starcoder2 model, adapted from Mistral. All changes are done through options, so Mistral itself is still supported. Main changes:
* Use layer norm (RMS still available as option)
* Use standard MLP (gated still available as option)
* Add back biases (optional)
* Change (default?) tokenizer class
*Embedding and... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29120/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29120/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29120",
"html_url": "https://github.com/huggingface/transformers/pull/29120",
"diff_url": "https://github.com/huggingface/transformers/pull/29120.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29120.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29119 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29119/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29119/comments | https://api.github.com/repos/huggingface/transformers/issues/29119/events | https://github.com/huggingface/transformers/pull/29119 | 2,143,005,049 | PR_kwDOCUB6oc5nUoNF | 29,119 | Generate: unset GenerationConfig parameters do not raise warning | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29119). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | 1,708 | MEMBER | null | # What does this PR do?:
Thank you @fxmarty for raising [this issue](https://github.com/huggingface/transformers/pull/25381#issuecomment-1952527813).
This PR allows users to unset (= set to `None`) unused parameters to ensure `generation_config.validate()` doesn't throw a warning. Previously, this was not possibl... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29119/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29119/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29119",
"html_url": "https://github.com/huggingface/transformers/pull/29119",
"diff_url": "https://github.com/huggingface/transformers/pull/29119.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29119.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29118 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29118/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29118/comments | https://api.github.com/repos/huggingface/transformers/issues/29118/events | https://github.com/huggingface/transformers/pull/29118 | 2,142,996,665 | PR_kwDOCUB6oc5nUmWT | 29,118 | Skipping test_save_load_low_cpu_mem_usage() for all failing models | {
"login": "hackyon",
"id": 1557853,
"node_id": "MDQ6VXNlcjE1NTc4NTM=",
"avatar_url": "https://avatars.githubusercontent.com/u/1557853?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hackyon",
"html_url": "https://github.com/hackyon",
"followers_url": "https://api.github.com/users/hackyon/... | [] | open | false | null | [] | [
"Hello @amyeroberts,\r\n\r\nI came up this PR to simply ignores all the failing tests for test_save_load_low_cpu_mem_usage(). This change should be safe as it only touches tests.\r\n\r\nThis should help unblock any PRs from being merged, while we work on getting tie_weights() into some of these models with #29024.\... | 1,708 | 1,708 | null | CONTRIBUTOR | null | # What does this PR do?
Skips all failing unit tests for test_save_load_low_cpu_mem_usage(). This should be temporary for some of them until the correct tie_weights() have been added to the models.
I created this temporary PR just in case it takes longer to make progress with #29024, and we don't want to block ot... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29118/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29118/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29118",
"html_url": "https://github.com/huggingface/transformers/pull/29118",
"diff_url": "https://github.com/huggingface/transformers/pull/29118.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29118.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29117 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29117/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29117/comments | https://api.github.com/repos/huggingface/transformers/issues/29117/events | https://github.com/huggingface/transformers/pull/29117 | 2,142,967,049 | PR_kwDOCUB6oc5nUf4p | 29,117 | Move misplaced line | {
"login": "kno10",
"id": 3997899,
"node_id": "MDQ6VXNlcjM5OTc4OTk=",
"avatar_url": "https://avatars.githubusercontent.com/u/3997899?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kno10",
"html_url": "https://github.com/kno10",
"followers_url": "https://api.github.com/users/kno10/follower... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29117). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | 1,708 | CONTRIBUTOR | null | Move misplaced line, improve code comment.
No functional change, the loss_fct is not used earlier and did not match the code comment either.
## Before submitting
- [X] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
## Who can review?
@ArthurZucker and @youn... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29117/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29117/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29117",
"html_url": "https://github.com/huggingface/transformers/pull/29117",
"diff_url": "https://github.com/huggingface/transformers/pull/29117.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29117.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29116 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29116/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29116/comments | https://api.github.com/repos/huggingface/transformers/issues/29116/events | https://github.com/huggingface/transformers/pull/29116 | 2,142,944,502 | PR_kwDOCUB6oc5nUbC6 | 29,116 | Track each row separately for stopping criteria | {
"login": "zucchini-nlp",
"id": 100715397,
"node_id": "U_kgDOBgDLhQ",
"avatar_url": "https://avatars.githubusercontent.com/u/100715397?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/zucchini-nlp",
"html_url": "https://github.com/zucchini-nlp",
"followers_url": "https://api.github.com/use... | [] | open | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29116). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"Yep, should be unrelated to stopping criteria but I will check ",
"@gante , I fo... | 1,708 | 1,708 | null | MEMBER | null | # What does this PR do?
Addresses the question raised in #28932. I accidentally messed up the first PR (#29056) that was approved, so this is the second version with the same changes.
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29116/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29116/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29116",
"html_url": "https://github.com/huggingface/transformers/pull/29116",
"diff_url": "https://github.com/huggingface/transformers/pull/29116.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29116.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29115 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29115/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29115/comments | https://api.github.com/repos/huggingface/transformers/issues/29115/events | https://github.com/huggingface/transformers/pull/29115 | 2,142,911,771 | PR_kwDOCUB6oc5nUT_w | 29,115 | Switch transformer for sequence classification | {
"login": "jlamprou",
"id": 41962910,
"node_id": "MDQ6VXNlcjQxOTYyOTEw",
"avatar_url": "https://avatars.githubusercontent.com/u/41962910?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jlamprou",
"html_url": "https://github.com/jlamprou",
"followers_url": "https://api.github.com/users/jla... | [] | open | false | null | [] | [
"> Hey! Thanks for contributing. As there are no released checkpoints for sequence classification, we usually try to:\r\n> \r\n> 1. Open an issue with the feature request\r\n> \r\n> 2. If the issue has strong support from the community (usually around 10 likes for example) the add support for it 🤗\r\n> \r\... | 1,708 | 1,708 | null | NONE | null | # What does this PR do?
This adds a sequence classification head to the PyTorch implementation of SwitchTransformers, following the pattern of T5ForSequenceClassification since it is also an encoder-decoder sequence classification model.
# NOTE:
- [ ] Failing tests, because we haven't a Checkpoint trained on class... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29115/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29115/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29115",
"html_url": "https://github.com/huggingface/transformers/pull/29115",
"diff_url": "https://github.com/huggingface/transformers/pull/29115.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29115.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29114 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29114/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29114/comments | https://api.github.com/repos/huggingface/transformers/issues/29114/events | https://github.com/huggingface/transformers/pull/29114 | 2,142,818,811 | PR_kwDOCUB6oc5nT_rS | 29,114 | Make torch.compile compilation >2x faster when using static cache + `generate` | {
"login": "fxmarty",
"id": 9808326,
"node_id": "MDQ6VXNlcjk4MDgzMjY=",
"avatar_url": "https://avatars.githubusercontent.com/u/9808326?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/fxmarty",
"html_url": "https://github.com/fxmarty",
"followers_url": "https://api.github.com/users/fxmarty/... | [] | open | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29114). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"@ArthurZucker @gante @LysandreJik This PR fixes many issues with the current `torc... | 1,708 | 1,708 | null | COLLABORATOR | null | This PR improves the compilation time of llama model with `torch.compile` when using `generate`, avoiding recompilation, recaptures of CUDA graphs & a bug in PyTorch w.r.t. indexing. | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29114/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 1
} | https://api.github.com/repos/huggingface/transformers/issues/29114/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29114",
"html_url": "https://github.com/huggingface/transformers/pull/29114",
"diff_url": "https://github.com/huggingface/transformers/pull/29114.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29114.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29113 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29113/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29113/comments | https://api.github.com/repos/huggingface/transformers/issues/29113/events | https://github.com/huggingface/transformers/issues/29113 | 2,142,798,975 | I_kwDOCUB6oc5_uIR_ | 29,113 | ValueError: lags cannot go further than history length, found lag 37 while history length is only 16 | {
"login": "nikhilajoshy",
"id": 37141775,
"node_id": "MDQ6VXNlcjM3MTQxNzc1",
"avatar_url": "https://avatars.githubusercontent.com/u/37141775?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nikhilajoshy",
"html_url": "https://github.com/nikhilajoshy",
"followers_url": "https://api.github.c... | [] | open | false | {
"login": "kashif",
"id": 8100,
"node_id": "MDQ6VXNlcjgxMDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/8100?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kashif",
"html_url": "https://github.com/kashif",
"followers_url": "https://api.github.com/users/kashif/followers",
... | [
{
"login": "kashif",
"id": 8100,
"node_id": "MDQ6VXNlcjgxMDA=",
"avatar_url": "https://avatars.githubusercontent.com/u/8100?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/kashif",
"html_url": "https://github.com/kashif",
"followers_url": "https://api.github.com/users/k... | [
"cc @kashif @NielsRogge ",
"@nikhilajoshy can you kindly paste in some more verbose error?",
"@kashif \r\n```\r\noutputs = model(\r\n ^^^^^^\r\n File \"/home/nikhila/encdec_venv/lib/python3.11/site-packages/torch/nn/modules/module.py\", line 1511, in _wrapped_call_impl\r\n return self._call_im... | 1,708 | 1,708 | null | NONE | null | ### System Info
- `transformers` version: 4.37.2
- Platform: Linux-5.15.0-94-generic-x86_64-with-glibc2.31
- Python version: 3.11.7
- Huggingface_hub version: 0.20.3
- Safetensors version: 0.4.2
- Accelerate version: 0.27.2
- Accelerate config: not found
- PyTorch version (GPU?): 2.2.0+cu121 (True)
- Tensor... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29113/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29113/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29112 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29112/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29112/comments | https://api.github.com/repos/huggingface/transformers/issues/29112/events | https://github.com/huggingface/transformers/pull/29112 | 2,142,709,791 | PR_kwDOCUB6oc5nTn1C | 29,112 | Remove static pretrained maps from the library's internals | {
"login": "LysandreJik",
"id": 30755778,
"node_id": "MDQ6VXNlcjMwNzU1Nzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LysandreJik",
"html_url": "https://github.com/LysandreJik",
"followers_url": "https://api.github.com/... | [] | open | false | null | [] | [
"Before this ungodly PR gets merged, I need to check that every checkpoint referenced here behaves the same once its pretrained map has been removed. \r\n\r\nI'll link the PRs open as a result in this comment.\r\n\r\n🟣: merged\r\n🟢: open\r\n🔴: closed\r\n🟡: not open yet\r\n\r\n## Repos with PRs opened\r\n\r\n###... | 1,708 | 1,708 | null | MEMBER | null | null | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29112/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29112/timeline | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29112",
"html_url": "https://github.com/huggingface/transformers/pull/29112",
"diff_url": "https://github.com/huggingface/transformers/pull/29112.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29112.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29111 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29111/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29111/comments | https://api.github.com/repos/huggingface/transformers/issues/29111/events | https://github.com/huggingface/transformers/issues/29111 | 2,142,682,356 | I_kwDOCUB6oc5_trz0 | 29,111 | RWKV5 tokenizer truncation | {
"login": "sedrick-keh-tri",
"id": 133716510,
"node_id": "U_kgDOB_haHg",
"avatar_url": "https://avatars.githubusercontent.com/u/133716510?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sedrick-keh-tri",
"html_url": "https://github.com/sedrick-keh-tri",
"followers_url": "https://api.githu... | [] | open | false | null | [] | [
"cc @ArthurZucker ",
"Hey! Thanks for opening an issue. The problem is that you are using `trust_remote_code=True` and thus the `https://huggingface.co/RWKV/v5-Eagle-7B-HF/blob/main/tokenization_rwkv_world.py` file is used. The code is not on transformers yet! \r\n\r\nUsing the tokenizer from #29095 should fix this... | 1,708 | 1,708 | null | NONE | null | ### System Info
Python 3.10. Transformers 4.37.2
### Who can help?
@arthur
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reprod... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29111/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29111/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29110 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29110/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29110/comments | https://api.github.com/repos/huggingface/transformers/issues/29110/events | https://github.com/huggingface/transformers/pull/29110 | 2,142,647,107 | PR_kwDOCUB6oc5nTaYs | 29,110 | Skip failing test_save_load_low_cpu_mem_usage tests | {
"login": "amyeroberts",
"id": 22614925,
"node_id": "MDQ6VXNlcjIyNjE0OTI1",
"avatar_url": "https://avatars.githubusercontent.com/u/22614925?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/amyeroberts",
"html_url": "https://github.com/amyeroberts",
"followers_url": "https://api.github.com/... | [] | open | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29110). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"I'm also in favor of reverting the original PR #28948 - if that PR is not something ... | 1,708 | 1,708 | null | COLLABORATOR | null | # What does this PR do?
Related to #29043 and #28948
Fixes more failing model tests on main which weren't picked up by the test fetcher cc @ydshieh cc @ArthurZucker | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29110/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29110/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29110",
"html_url": "https://github.com/huggingface/transformers/pull/29110",
"diff_url": "https://github.com/huggingface/transformers/pull/29110.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29110.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29109 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29109/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29109/comments | https://api.github.com/repos/huggingface/transformers/issues/29109/events | https://github.com/huggingface/transformers/pull/29109 | 2,142,417,003 | PR_kwDOCUB6oc5nSns8 | 29,109 | Llama: fix batched generation | {
"login": "gante",
"id": 12240844,
"node_id": "MDQ6VXNlcjEyMjQwODQ0",
"avatar_url": "https://avatars.githubusercontent.com/u/12240844?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gante",
"html_url": "https://github.com/gante",
"followers_url": "https://api.github.com/users/gante/follow... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29109). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"I'll have to run the benchmark on the A100 to make sure everything is alright but ... | 1,708 | 1,708 | 1,708 | MEMBER | null | # What does this PR do?
Fixes batched inference on llama, after the static cache changes were added. For instance, `RUN_SLOW=1 py.test tests/test_cache_utils.py::CacheIntegrationTest::test_dynamic_cache_beam_search` now passes.
### What was wrong?
`position_ids` has shape `[bsz, seq_len]`. The line computing `fr... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29109/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29109/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29109",
"html_url": "https://github.com/huggingface/transformers/pull/29109",
"diff_url": "https://github.com/huggingface/transformers/pull/29109.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29109.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29108 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29108/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29108/comments | https://api.github.com/repos/huggingface/transformers/issues/29108/events | https://github.com/huggingface/transformers/pull/29108 | 2,142,388,605 | PR_kwDOCUB6oc5nShdP | 29,108 | [Phi] Add support for sdpa | {
"login": "hackyon",
"id": 1557853,
"node_id": "MDQ6VXNlcjE1NTc4NTM=",
"avatar_url": "https://avatars.githubusercontent.com/u/1557853?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hackyon",
"html_url": "https://github.com/hackyon",
"followers_url": "https://api.github.com/users/hackyon/... | [] | closed | false | null | [] | [
"Hey @gugarosa @ArthurZucker @younesbelkada 👋\r\n\r\nI'm looking for more places to add support for SDPA and figured Phi-2 could be a good one. \r\n\r\nBeen reading up on the issues regarding attention overflow for Phi-2 (#28673, #28488), and I think SPDA would probably be affected by it as well (if it chooses the... | 1,708 | 1,708 | 1,708 | CONTRIBUTOR | null | # What does this PR do?
Adding support for SDPA to Phi (See #28005)
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#c... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29108/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29108/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29108",
"html_url": "https://github.com/huggingface/transformers/pull/29108",
"diff_url": "https://github.com/huggingface/transformers/pull/29108.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29108.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29107 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29107/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29107/comments | https://api.github.com/repos/huggingface/transformers/issues/29107/events | https://github.com/huggingface/transformers/issues/29107 | 2,142,291,862 | I_kwDOCUB6oc5_sMeW | 29,107 | Cannot use time series transformer as encoder and gpt model as decoder using encoder decoder architecture from hugging face | {
"login": "nikhilajoshy",
"id": 37141775,
"node_id": "MDQ6VXNlcjM3MTQxNzc1",
"avatar_url": "https://avatars.githubusercontent.com/u/37141775?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/nikhilajoshy",
"html_url": "https://github.com/nikhilajoshy",
"followers_url": "https://api.github.c... | [] | open | false | null | [] | [
"Hi @nikhilajoshy, thanks for raising an issue! \r\n\r\nThis is a question best placed in our [forums](https://huggingface.co/proxy/discuss.huggingface.co/). We try to reserve the github issues for feature requests and bug reports."
] | 1,708 | 1,708 | null | NONE | null | ### System Info
-
### Who can help?
_No response_
### Information
- [ ] The official example scripts
- [ ] My own modified scripts
### Tasks
- [ ] An officially supported task in the `examples` folder (such as GLUE/SQuAD, ...)
- [ ] My own task or dataset (give details below)
### Reproduction
EncoderDecoderMod... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29107/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29107/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29106 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29106/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29106/comments | https://api.github.com/repos/huggingface/transformers/issues/29106/events | https://github.com/huggingface/transformers/pull/29106 | 2,142,285,524 | PR_kwDOCUB6oc5nSKnH | 29,106 | support SDPA Attention in stablelm | {
"login": "eaidova",
"id": 29454499,
"node_id": "MDQ6VXNlcjI5NDU0NDk5",
"avatar_url": "https://avatars.githubusercontent.com/u/29454499?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/eaidova",
"html_url": "https://github.com/eaidova",
"followers_url": "https://api.github.com/users/eaidov... | [] | open | false | null | [] | [
"cc @fxmarty ",
"> Looks alright but we need to add an integration test IMO :)\r\n\r\nadded",
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29106). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the la... | 1,708 | 1,708 | null | NONE | null | # What does this PR do?
enable SDPA attention in stablelm architecture
## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [x] Did you read the [contributor guideline](https://github.com/huggingface/transformers/blob/main/CONTRIBUTING.md#cr... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29106/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29106/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29106",
"html_url": "https://github.com/huggingface/transformers/pull/29106",
"diff_url": "https://github.com/huggingface/transformers/pull/29106.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29106.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29105 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29105/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29105/comments | https://api.github.com/repos/huggingface/transformers/issues/29105/events | https://github.com/huggingface/transformers/pull/29105 | 2,142,229,012 | PR_kwDOCUB6oc5nR-Lw | 29,105 | Fix the `bert-base-cased` tokenizer configuration test | {
"login": "LysandreJik",
"id": 30755778,
"node_id": "MDQ6VXNlcjMwNzU1Nzc4",
"avatar_url": "https://avatars.githubusercontent.com/u/30755778?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LysandreJik",
"html_url": "https://github.com/LysandreJik",
"followers_url": "https://api.github.com/... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29105). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | 1,708 | MEMBER | null | In the process of updating the tokenizer configurations on the Hub, this test needs to be updated to reflect the new value of the configuration. | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29105/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29105/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29105",
"html_url": "https://github.com/huggingface/transformers/pull/29105",
"diff_url": "https://github.com/huggingface/transformers/pull/29105.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29105.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29104 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29104/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29104/comments | https://api.github.com/repos/huggingface/transformers/issues/29104/events | https://github.com/huggingface/transformers/pull/29104 | 2,142,219,632 | PR_kwDOCUB6oc5nR8G6 | 29,104 | Added image_captioning version in es and included in toctree file | {
"login": "gisturiz",
"id": 48292332,
"node_id": "MDQ6VXNlcjQ4MjkyMzMy",
"avatar_url": "https://avatars.githubusercontent.com/u/48292332?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/gisturiz",
"html_url": "https://github.com/gisturiz",
"followers_url": "https://api.github.com/users/gis... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29104). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | 1,708 | CONTRIBUTOR | null | # What does this PR do?
Translated image_captioning from en to es from issue https://github.com/huggingface/transformers/issues/28936 begun by @stevhliu. I will continue to go through the documentation and make the correct translations.
(closed previous PR and reopened due to a rebasing issue)
Fixes # https://github... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29104/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29104/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29104",
"html_url": "https://github.com/huggingface/transformers/pull/29104",
"diff_url": "https://github.com/huggingface/transformers/pull/29104.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29104.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29103 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29103/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29103/comments | https://api.github.com/repos/huggingface/transformers/issues/29103/events | https://github.com/huggingface/transformers/issues/29103 | 2,142,189,413 | I_kwDOCUB6oc5_rzdl | 29,103 | Request to add FLMR | {
"login": "LinWeizheDragon",
"id": 33350454,
"node_id": "MDQ6VXNlcjMzMzUwNDU0",
"avatar_url": "https://avatars.githubusercontent.com/u/33350454?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LinWeizheDragon",
"html_url": "https://github.com/LinWeizheDragon",
"followers_url": "https://api... | [
{
"id": 1843244711,
"node_id": "MDU6TGFiZWwxODQzMjQ0NzEx",
"url": "https://api.github.com/repos/huggingface/transformers/labels/New%20model",
"name": "New model",
"color": "fbca04",
"default": false,
"description": ""
}
] | open | false | null | [] | [] | 1,708 | 1,708 | null | NONE | null | ### Model description
## Basic Information
This issue requests adding Fine-grained Late-interaction Multi-modal Retriever (FLMR).
The model leverages late interaction (as originally proposed by Stanford [ColBERT](https://github.com/stanford-futuredata/ColBERT)) to compute token-level similarity between every que... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29103/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29103/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29102 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29102/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29102/comments | https://api.github.com/repos/huggingface/transformers/issues/29102/events | https://github.com/huggingface/transformers/pull/29102 | 2,141,970,710 | PR_kwDOCUB6oc5nRFh9 | 29,102 | Fix two tiny typos in `pipelines/base.py::Pipeline::_sanitize_parameters()`'s docstring | {
"login": "sadra-barikbin",
"id": 22097587,
"node_id": "MDQ6VXNlcjIyMDk3NTg3",
"avatar_url": "https://avatars.githubusercontent.com/u/22097587?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/sadra-barikbin",
"html_url": "https://github.com/sadra-barikbin",
"followers_url": "https://api.gi... | [] | closed | false | null | [] | [
"One more thing. We have `False` for default value of `clean_up_tokenization_spaces` argument in `TextGenerationPipeline`'s docstring:\r\nhttps://github.com/huggingface/transformers/blob/593230f0a1150ea9c0477b9d859f25daf73c8c33/src/transformers/pipelines/text_generation.py#L207-L208\r\n\r\nbut its default value in ... | 1,708 | 1,708 | 1,708 | CONTRIBUTOR | null | Hi there!
To fix two tiny typos in `pipelines/base.py::Pipeline::_sanitize_parameters()`'s docstring.
@Narsil
| {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29102/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29102/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29102",
"html_url": "https://github.com/huggingface/transformers/pull/29102",
"diff_url": "https://github.com/huggingface/transformers/pull/29102.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29102.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29101 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29101/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29101/comments | https://api.github.com/repos/huggingface/transformers/issues/29101/events | https://github.com/huggingface/transformers/issues/29101 | 2,141,799,722 | I_kwDOCUB6oc5_qUUq | 29,101 | Models with remote code are not loaded correctly when there's `.` in their name. | {
"login": "BlackSamorez",
"id": 16901341,
"node_id": "MDQ6VXNlcjE2OTAxMzQx",
"avatar_url": "https://avatars.githubusercontent.com/u/16901341?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BlackSamorez",
"html_url": "https://github.com/BlackSamorez",
"followers_url": "https://api.github.c... | [] | closed | false | null | [] | [
"cc @Rocketknight1 ",
"Hi @BlackSamorez, this issue has been reported already at #28919. We're working on a fix right now! I'm going to close this issue as a duplicate."
] | 1,708 | 1,708 | 1,708 | CONTRIBUTOR | null | ### System Info
```
- `transformers` version: 4.37.0
- Platform: Linux-6.1.58+-x86_64-with-glibc2.35
- Python version: 3.10.12
- Huggingface_hub version: 0.20.3
- Safetensors version: 0.4.2
- Accelerate version: 0.28.0.dev0
- Accelerate config: not found
- PyTorch version (GPU?): 2.2.0+cu121 (True)
- Tensorf... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29101/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29101/timeline | completed | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29100 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29100/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29100/comments | https://api.github.com/repos/huggingface/transformers/issues/29100/events | https://github.com/huggingface/transformers/issues/29100 | 2,141,751,170 | I_kwDOCUB6oc5_qIeC | 29,100 | Getting Assertion Error when calling neo4j chain for inference | {
"login": "KaifAhmad1",
"id": 98801504,
"node_id": "U_kgDOBeOXYA",
"avatar_url": "https://avatars.githubusercontent.com/u/98801504?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/KaifAhmad1",
"html_url": "https://github.com/KaifAhmad1",
"followers_url": "https://api.github.com/users/KaifA... | [] | open | false | null | [] | [
"Hi @KaifAhmad1, thanks for opening an issue! \r\n\r\nPlease make sure to provide a minimal code reproducer and information about the bug encountered, including the full error traceback when reporting an issue.\r\n\r\nIf the error is coming from `bitsandbytes` there isn't anything the transformers team can do. ",
... | 1,708 | 1,708 | null | NONE | null | ### System Info
langchain version = 0.1.7
bitsandbytes = 0.42.0
pip = 24.0
cuda = 12.1
OS Windows 11 x64
### Who can help?
Hey, @SunMarc @younesbelkada please help me out.
### Information
- [ ] The official example scripts
- [X] My own modified scripts
### Tasks
- [ ] An officially supported task in the `ex... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29100/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29100/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29099 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29099/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29099/comments | https://api.github.com/repos/huggingface/transformers/issues/29099/events | https://github.com/huggingface/transformers/pull/29099 | 2,141,668,363 | PR_kwDOCUB6oc5nQDN9 | 29,099 | Fix the behavior of collecting 'num_input_tokens_seen' | {
"login": "YouliangHUANG",
"id": 56789071,
"node_id": "MDQ6VXNlcjU2Nzg5MDcx",
"avatar_url": "https://avatars.githubusercontent.com/u/56789071?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/YouliangHUANG",
"html_url": "https://github.com/YouliangHUANG",
"followers_url": "https://api.githu... | [] | open | false | null | [] | [] | 1,708 | 1,708 | null | NONE | null | # What does this PR do?
The length of "inputs[main_input_name]" is not guaranteed to be the same when using DDP, which may make the training process hang. Besides, in a distributed setup, it costs a lot to gather the WHOLE input tensors on different workers. It is better to call .numel() first and then .gather().
... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29099/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29099/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29099",
"html_url": "https://github.com/huggingface/transformers/pull/29099",
"diff_url": "https://github.com/huggingface/transformers/pull/29099.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29099.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29098 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29098/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29098/comments | https://api.github.com/repos/huggingface/transformers/issues/29098/events | https://github.com/huggingface/transformers/issues/29098 | 2,141,628,529 | I_kwDOCUB6oc5_pqhx | 29,098 | Flashatten2 avaiable should handle if hardware support or not? | {
"login": "lucasjinreal",
"id": 21303438,
"node_id": "MDQ6VXNlcjIxMzAzNDM4",
"avatar_url": "https://avatars.githubusercontent.com/u/21303438?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/lucasjinreal",
"html_url": "https://github.com/lucasjinreal",
"followers_url": "https://api.github.c... | [] | open | false | null | [] | [
"Hi @lucasjinreal, thanks for opening an issue! \r\n\r\nPlease make sure to follow the [issue template](https://github.com/huggingface/transformers/blob/main/.github/ISSUE_TEMPLATE/bug-report.yml) and provide: \r\n* A minimal code reproducer\r\n* All relevant error information including the full error traceback\r\n... | 1,708 | 1,708 | null | NONE | null | For some docker images the flashattn installed, but v100 not support.
It will return True and raise error in runtime.
Also, since torch already support flash2 inside, add torch2.2 check and using inside version? | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29098/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29098/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29097 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29097/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29097/comments | https://api.github.com/repos/huggingface/transformers/issues/29097/events | https://github.com/huggingface/transformers/pull/29097 | 2,141,556,123 | PR_kwDOCUB6oc5nPqeA | 29,097 | change version | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.c... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29097). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"We still have failing for `test_save_load_low_cpu_mem_usage` which are already on ... | 1,708 | 1,708 | 1,708 | COLLABORATOR | null | # What does this PR do?
Try to change cached version | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29097/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 1,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29097/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29097",
"html_url": "https://github.com/huggingface/transformers/pull/29097",
"diff_url": "https://github.com/huggingface/transformers/pull/29097.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29097.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29096 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29096/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29096/comments | https://api.github.com/repos/huggingface/transformers/issues/29096/events | https://github.com/huggingface/transformers/pull/29096 | 2,141,530,987 | PR_kwDOCUB6oc5nPk6E | 29,096 | Fix: Fixed the previous tracking URI setting logic to prevent clashes with original MLflow code. | {
"login": "seanswyi",
"id": 20367759,
"node_id": "MDQ6VXNlcjIwMzY3NzU5",
"avatar_url": "https://avatars.githubusercontent.com/u/20367759?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/seanswyi",
"html_url": "https://github.com/seanswyi",
"followers_url": "https://api.github.com/users/sea... | [] | open | false | null | [] | [
"@seanswyi For the failing tests, these are unrelated and a known issue happening on our CI. We're currently working on a fix for it and will let you know asap once it's merged and you can rebase to get the CI green",
"@amyeroberts No worries, thanks for the heads up!",
"@seanswyi The failing tests should now b... | 1,708 | 1,708 | null | CONTRIBUTOR | null | # What does this PR do?
The previous code was calling the `mlflow.set_tracking_uri` function regardless of whether or not the environment variable `MLFLOW_TRACKING_URI` is even set. This led to clashes with the original MLflow implementation and therefore the logic was changed to only calling the function when the e... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29096/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29096/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29096",
"html_url": "https://github.com/huggingface/transformers/pull/29096",
"diff_url": "https://github.com/huggingface/transformers/pull/29096.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29096.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29095 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29095/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29095/comments | https://api.github.com/repos/huggingface/transformers/issues/29095/events | https://github.com/huggingface/transformers/pull/29095 | 2,141,352,273 | PR_kwDOCUB6oc5nO96x | 29,095 | [`RWKV5`] Add support for RWKV5 model | {
"login": "ArthurZucker",
"id": 48595927,
"node_id": "MDQ6VXNlcjQ4NTk1OTI3",
"avatar_url": "https://avatars.githubusercontent.com/u/48595927?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ArthurZucker",
"html_url": "https://github.com/ArthurZucker",
"followers_url": "https://api.github.c... | [] | open | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29095). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"Not sure if this is the appropriate place to post this, but I've been running into... | 1,708 | 1,708 | null | COLLABORATOR | null | # What does this PR do?
Adds RWKV5, supersedes #26963 | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29095/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29095/timeline | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29095",
"html_url": "https://github.com/huggingface/transformers/pull/29095",
"diff_url": "https://github.com/huggingface/transformers/pull/29095.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29095.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29094 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29094/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29094/comments | https://api.github.com/repos/huggingface/transformers/issues/29094/events | https://github.com/huggingface/transformers/issues/29094 | 2,141,335,890 | I_kwDOCUB6oc5_ojFS | 29,094 | altclip can not be traced by fx? | {
"login": "TXacs",
"id": 60869411,
"node_id": "MDQ6VXNlcjYwODY5NDEx",
"avatar_url": "https://avatars.githubusercontent.com/u/60869411?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/TXacs",
"html_url": "https://github.com/TXacs",
"followers_url": "https://api.github.com/users/TXacs/follow... | [] | open | false | null | [] | [
"Hi @TXacs, thanks for raising this issue! \r\n\r\nYou need to pass in `input_names` to `symbolic_trace`:\r\n\r\n```py\r\nfrom config import load_config\r\n\r\nfrom transformers import (\r\n AltCLIPModel,\r\n AltCLIPConfig,\r\n)\r\nfrom transformers.utils.fx import symbolic_trace\r\n\r\ndef main():\r\n con... | 1,708 | 1,708 | null | NONE | null | ### System Info
- `transformers` version: 4.37.1
- Platform: Linux-5.4.0-47-generic-x86_64-with-glibc2.17
- Python version: 3.8.13
- Huggingface_hub version: 0.19.4
- Safetensors version: 0.3.1
- Accelerate version: not installed
- Accelerate config: not found
- PyTorch version (GPU?): 2.1.2+cu118 (True)
- Ten... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29094/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29094/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29093 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29093/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29093/comments | https://api.github.com/repos/huggingface/transformers/issues/29093/events | https://github.com/huggingface/transformers/issues/29093 | 2,141,302,021 | I_kwDOCUB6oc5_oa0F | 29,093 | Generation doesn't work as expected with input_embeds | {
"login": "dipta007",
"id": 13894030,
"node_id": "MDQ6VXNlcjEzODk0MDMw",
"avatar_url": "https://avatars.githubusercontent.com/u/13894030?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dipta007",
"html_url": "https://github.com/dipta007",
"followers_url": "https://api.github.com/users/dip... | [] | closed | false | null | [] | [
"@dipta007, hi! This [PR](https://github.com/huggingface/transformers/pull/28994) fixed it",
"@dipta007 as @zucchini-nlp wrote, if you add `!pip install --upgrade git+https://github.com/huggingface/transformers.git` on top of your notebook it will work 🤗 ",
"Thanks"
] | 1,708 | 1,708 | 1,708 | NONE | null | ### System Info
- `transformers` version: 4.35.2
- Platform: Linux-6.1.58+-x86_64-with-glibc2.35
- Python version: 3.10.12
- Huggingface_hub version: 0.20.3
- Safetensors version: 0.4.2
- Accelerate version: not installed
- Accelerate config: not found
- PyTorch version (GPU?): 2.1.0+cu121 (False)
- Tensorflow... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29093/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29093/timeline | completed | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29092 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29092/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29092/comments | https://api.github.com/repos/huggingface/transformers/issues/29092/events | https://github.com/huggingface/transformers/pull/29092 | 2,141,298,539 | PR_kwDOCUB6oc5nOyRk | 29,092 | FIX [`bnb` / `tests`]: Fix currently failing bnb tests | {
"login": "younesbelkada",
"id": 49240599,
"node_id": "MDQ6VXNlcjQ5MjQwNTk5",
"avatar_url": "https://avatars.githubusercontent.com/u/49240599?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/younesbelkada",
"html_url": "https://github.com/younesbelkada",
"followers_url": "https://api.githu... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29092). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | 1,708 | CONTRIBUTOR | null | # What does this PR do?
https://github.com/huggingface/transformers/pull/29001 changed the logic of handling how to get linear layers from testing models. In fact, the `model_type` should always stay `"gpt2"` and not `"openai-community/gpt2"`
cc @amyeroberts @Titus-von-Koeller | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29092/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29092/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29092",
"html_url": "https://github.com/huggingface/transformers/pull/29092",
"diff_url": "https://github.com/huggingface/transformers/pull/29092.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29092.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29091 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29091/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29091/comments | https://api.github.com/repos/huggingface/transformers/issues/29091/events | https://github.com/huggingface/transformers/pull/29091 | 2,141,252,840 | PR_kwDOCUB6oc5nOoYV | 29,091 | fix the post-processing link | {
"login": "davies-w",
"id": 6550854,
"node_id": "MDQ6VXNlcjY1NTA4NTQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/6550854?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/davies-w",
"html_url": "https://github.com/davies-w",
"followers_url": "https://api.github.com/users/davie... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29091). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | 1,708 | CONTRIBUTOR | null | The link in evaluation was missing a hyphen between post and processing. I fixed this, for English only. Someone with the ability to do a global search/replace should fix the other languages (if indeed they have this issue).
# What does this PR do?
Fixes a broken link in the documentation.
## Before submitt... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29091/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29091/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29091",
"html_url": "https://github.com/huggingface/transformers/pull/29091",
"diff_url": "https://github.com/huggingface/transformers/pull/29091.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29091.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29090 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29090/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29090/comments | https://api.github.com/repos/huggingface/transformers/issues/29090/events | https://github.com/huggingface/transformers/pull/29090 | 2,141,198,507 | PR_kwDOCUB6oc5nOdgh | 29,090 | Do not use pooling for squad conversions when thread == 1 | {
"login": "hackyon",
"id": 1557853,
"node_id": "MDQ6VXNlcjE1NTc4NTM=",
"avatar_url": "https://avatars.githubusercontent.com/u/1557853?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/hackyon",
"html_url": "https://github.com/hackyon",
"followers_url": "https://api.github.com/users/hackyon/... | [] | closed | false | null | [] | [] | 1,708 | 1,708 | 1,708 | CONTRIBUTOR | null | Testing this on CI tool. Improvement seems to happen only for certain machines.
This significantly improves the time it takes for the qa pipeline test. For example, the bert torch tests went from ~90s to ~15s locally.
# What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite do... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29090/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29090/timeline | null | true | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29090",
"html_url": "https://github.com/huggingface/transformers/pull/29090",
"diff_url": "https://github.com/huggingface/transformers/pull/29090.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29090.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29089 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29089/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29089/comments | https://api.github.com/repos/huggingface/transformers/issues/29089/events | https://github.com/huggingface/transformers/issues/29089 | 2,141,162,788 | I_kwDOCUB6oc5_n40k | 29,089 | Caching image prototype embeddings for image-guided object detection using OWL-ViT | {
"login": "jakubhejhal",
"id": 97042178,
"node_id": "U_kgDOBci_Ag",
"avatar_url": "https://avatars.githubusercontent.com/u/97042178?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/jakubhejhal",
"html_url": "https://github.com/jakubhejhal",
"followers_url": "https://api.github.com/users/ja... | [] | closed | false | null | [] | [
"Hi, thanks for raising an issue! \r\n\r\nThis is a question best placed in our [forums](https://huggingface.co/proxy/discuss.huggingface.co/). We try to reserve the github issues for feature requests and bug reports."
] | 1,708 | 1,708 | 1,708 | NONE | null | ### Feature request
The [OWL-ViT](https://arxiv.org/abs/2205.06230) model currently supports image-guided one-shot object detection by using reference image embeddings as the input to the classification head instead of the text embedding. This is implemented by the [image_guided_detection](https://huggingface.co/docs/... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29089/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29089/timeline | completed | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29088 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29088/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29088/comments | https://api.github.com/repos/huggingface/transformers/issues/29088/events | https://github.com/huggingface/transformers/pull/29088 | 2,141,149,538 | PR_kwDOCUB6oc5nOT2h | 29,088 | Remove misleading model disclaimers in docs for gpt2 and gpt neo QA. | {
"login": "Whenning42",
"id": 8920171,
"node_id": "MDQ6VXNlcjg5MjAxNzE=",
"avatar_url": "https://avatars.githubusercontent.com/u/8920171?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Whenning42",
"html_url": "https://github.com/Whenning42",
"followers_url": "https://api.github.com/users... | [] | open | false | null | [] | [
"Hi @Whenning42, thanks for opening this PR! \r\n\r\nCould you share what the effect is of removing the `real_checkpoint` argument? Does the disclaimer disappear? "
] | 1,708 | 1,708 | null | NONE | null | The removed disclaimers suggest the real checkpoints aren't correct and that they need to be replaced by themselves.
GPT 2 disclaimer:
"This example uses a random model as the real ones are all very big. To get proper results, you should use gpt2 instead of gpt2. If you get out-of-memory when loading that checkpoin... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29088/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29088/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29088",
"html_url": "https://github.com/huggingface/transformers/pull/29088",
"diff_url": "https://github.com/huggingface/transformers/pull/29088.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29088.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29087 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29087/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29087/comments | https://api.github.com/repos/huggingface/transformers/issues/29087/events | https://github.com/huggingface/transformers/issues/29087 | 2,141,128,881 | I_kwDOCUB6oc5_nwix | 29,087 | Mixtral inference breaks when `output_router_logits=True` | {
"login": "LeonardoEmili",
"id": 36575651,
"node_id": "MDQ6VXNlcjM2NTc1NjUx",
"avatar_url": "https://avatars.githubusercontent.com/u/36575651?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LeonardoEmili",
"html_url": "https://github.com/LeonardoEmili",
"followers_url": "https://api.githu... | [] | open | false | null | [] | [
"When running inference you should set `model.config.output_router_logits=False`",
"Thanks @ArthurZucker, I believe it is a bit hard to spot the correct behaviour from the [docs](https://huggingface.co/docs/transformers/main/model_doc/mixtral#transformers.MixtralModel.forward.output_router_logits) so I was wonde... | 1,708 | 1,708 | null | NONE | null | ### System Info
- `transformers` version: 4.38.0.dev0
- Platform: Linux-5.15.0-1038-oracle-x86_64-with-glibc2.35
- Python version: 3.10.12
- Huggingface_hub version: 0.20.3
- Safetensors version: 0.4.2
- Accelerate version: 0.26.1
- Accelerate config: not found
- PyTorch version (GPU?): 2.1.1+cu121 (True)
- T... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29087/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29087/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29086 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29086/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29086/comments | https://api.github.com/repos/huggingface/transformers/issues/29086/events | https://github.com/huggingface/transformers/pull/29086 | 2,141,128,404 | PR_kwDOCUB6oc5nOPwa | 29,086 | 🌐 [i18n-KO] Translated generation_strategies.md to Korean | {
"login": "AI4Harmony",
"id": 160417616,
"node_id": "U_kgDOCY_HUA",
"avatar_url": "https://avatars.githubusercontent.com/u/160417616?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/AI4Harmony",
"html_url": "https://github.com/AI4Harmony",
"followers_url": "https://api.github.com/users/AI4... | [] | open | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29086). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | null | NONE | null | # What does this PR do?
Translated the generation_strategies.md file of the documentation to Korean.
Thank you in advance for your review.
Part of https://github.com/huggingface/transformers/issues/20179
## Before submitting
- [x] Check for missing / redundant translations (번역 누락/중복 검사)
- [x] Grammar Chec... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29086/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29086/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29086",
"html_url": "https://github.com/huggingface/transformers/pull/29086",
"diff_url": "https://github.com/huggingface/transformers/pull/29086.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29086.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29085 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29085/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29085/comments | https://api.github.com/repos/huggingface/transformers/issues/29085/events | https://github.com/huggingface/transformers/pull/29085 | 2,141,125,268 | PR_kwDOCUB6oc5nOPKB | 29,085 | [WIP] Update legacy Repository usage in `examples/pytorch/text-classification/run_glue_no_trainer.py` | {
"login": "Hvanderwilk",
"id": 15908112,
"node_id": "MDQ6VXNlcjE1OTA4MTEy",
"avatar_url": "https://avatars.githubusercontent.com/u/15908112?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Hvanderwilk",
"html_url": "https://github.com/Hvanderwilk",
"followers_url": "https://api.github.com/... | [] | open | false | null | [] | [] | 1,708 | 1,708 | null | NONE | null | # What does this PR do?
The usage in the example is marked for deprecation here https://huggingface.co/docs/huggingface_hub/guides/upload#legacy-upload-files-with-git-lfs. Use the new recommended methods.
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is go... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29085/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29085/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29085",
"html_url": "https://github.com/huggingface/transformers/pull/29085",
"diff_url": "https://github.com/huggingface/transformers/pull/29085.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29085.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29084 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29084/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29084/comments | https://api.github.com/repos/huggingface/transformers/issues/29084/events | https://github.com/huggingface/transformers/pull/29084 | 2,141,100,964 | PR_kwDOCUB6oc5nOKKF | 29,084 | [Mistral, Mixtral] Improve docs | {
"login": "NielsRogge",
"id": 48327001,
"node_id": "MDQ6VXNlcjQ4MzI3MDAx",
"avatar_url": "https://avatars.githubusercontent.com/u/48327001?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/NielsRogge",
"html_url": "https://github.com/NielsRogge",
"followers_url": "https://api.github.com/use... | [] | open | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29084). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | null | CONTRIBUTOR | null | # What does this PR do?
This PR improves the docs of Mistral and Mixtral, by including:
- explaining the difference between base models vs instruction tuned ones, including the newer [v2 checkpoint](https://huggingface.co/mistralai/Mistral-7B-Instruct-v0.2) of Mistral-7B
- use of chat templates
- quantization t... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29084/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29084/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29084",
"html_url": "https://github.com/huggingface/transformers/pull/29084",
"diff_url": "https://github.com/huggingface/transformers/pull/29084.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29084.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29083 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29083/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29083/comments | https://api.github.com/repos/huggingface/transformers/issues/29083/events | https://github.com/huggingface/transformers/pull/29083 | 2,141,091,613 | PR_kwDOCUB6oc5nOIOL | 29,083 | Allow repo_id--module.classname config definition even if loading from path | {
"login": "rl337",
"id": 387895,
"node_id": "MDQ6VXNlcjM4Nzg5NQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/387895?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/rl337",
"html_url": "https://github.com/rl337",
"followers_url": "https://api.github.com/users/rl337/followers"... | [] | open | false | null | [] | [
"cc @Rocketknight1"
] | 1,708 | 1,708 | null | CONTRIBUTOR | null | When you have a model that's in a path that's not exactly the repo_id relative to the current directory and the config has AutoConfig of the form model_id--module.classname in it, you can't load the model using the path to the model because resolving module.classname ends up being relative to repo_id as defined in the ... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29083/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29083/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29083",
"html_url": "https://github.com/huggingface/transformers/pull/29083",
"diff_url": "https://github.com/huggingface/transformers/pull/29083.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29083.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29082 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29082/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29082/comments | https://api.github.com/repos/huggingface/transformers/issues/29082/events | https://github.com/huggingface/transformers/pull/29082 | 2,140,990,281 | PR_kwDOCUB6oc5nN0GE | 29,082 | FEAT [`Trainer` / `bnb`]: Add RMSProp from `bitsandbytes` to HF `Trainer` | {
"login": "younesbelkada",
"id": 49240599,
"node_id": "MDQ6VXNlcjQ5MjQwNTk5",
"avatar_url": "https://avatars.githubusercontent.com/u/49240599?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/younesbelkada",
"html_url": "https://github.com/younesbelkada",
"followers_url": "https://api.githu... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29082). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | 1,708 | CONTRIBUTOR | null | # What does this PR do?
As requested by the community, this PR adds support for bnb RMSProp optimizers to the HF Trainer!
`RMSProp` exists in bitsandbytes since its first commit: https://github.com/TimDettmers/bitsandbytes/commit/7439924891496025edf60c9da6a782f362a50c70#diff-8384af03566f84c3055f3fee7b1516696a1546... | {
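For context, the update rule that RMSProp implements (and that bitsandbytes provides in fused 8-bit/32-bit form) can be sketched in plain Python; this is only the textbook rule over scalar parameters, not the bitsandbytes kernel:

```python
def rmsprop_step(params, grads, sq_avgs, lr=1e-3, alpha=0.99, eps=1e-8):
    """One RMSProp step over flat lists of scalar parameters.

    sq_avgs holds the exponential moving average of squared gradients;
    each parameter's step is scaled by the RMS of its recent gradients.
    """
    new_params, new_sq_avgs = [], []
    for p, g, sq in zip(params, grads, sq_avgs):
        sq = alpha * sq + (1 - alpha) * g * g   # EMA of squared gradients
        p = p - lr * g / (sq ** 0.5 + eps)      # normalize the step size
        new_params.append(p)
        new_sq_avgs.append(sq)
    return new_params, new_sq_avgs
```

The 8-bit variant keeps `sq_avgs` in quantized form to cut optimizer memory, which is the main draw of using the bitsandbytes implementation.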
"url": "https://api.github.com/repos/huggingface/transformers/issues/29082/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 1,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29082/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29082",
"html_url": "https://github.com/huggingface/transformers/pull/29082",
"diff_url": "https://github.com/huggingface/transformers/pull/29082.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29082.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29081 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29081/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29081/comments | https://api.github.com/repos/huggingface/transformers/issues/29081/events | https://github.com/huggingface/transformers/pull/29081 | 2,140,736,966 | PR_kwDOCUB6oc5nM-Px | 29,081 | token healing impl | {
"login": "Ayenem",
"id": 50707385,
"node_id": "MDQ6VXNlcjUwNzA3Mzg1",
"avatar_url": "https://avatars.githubusercontent.com/u/50707385?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Ayenem",
"html_url": "https://github.com/Ayenem",
"followers_url": "https://api.github.com/users/Ayenem/fo... | [] | open | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29081). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"CI is failing due to an automatic update in the pytest package, we are tracking it... | 1,708 | 1,708 | null | NONE | null | # What does this PR do?
Token healing rectifies the token boundary bias in greedy tokenization. It does this by trimming and regrowing the prompt to better align with the model's tokenizer, thus enhancing generation quality. The improvement is clearest with completion models.
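The trim-and-regrow step can be illustrated with a toy vocabulary; `heal_prompt` below is a hypothetical helper, not the API this PR adds — real token healing works on the model's tokenizer and constrains the first decoding step with a logit mask:

```python
def heal_prompt(token_ids, vocab):
    """Trim the prompt's last token and return (kept_ids, allowed_ids).

    allowed_ids are the vocabulary tokens whose text extends the trimmed
    tail, so the model can "regrow" across the token boundary instead of
    being locked into the greedy tokenization split.
    """
    if not token_ids:
        return token_ids, []
    *kept, last = token_ids
    tail = vocab[last]  # text removed from the end of the prompt
    allowed = [tid for tid, tok in vocab.items() if tok.startswith(tail)]
    return kept, allowed
```

With a prompt ending in `:`, the healed prompt drops the `:` and permits both `:` and `://` at the next step, letting the model choose the better merged token.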
Token boundary bias is a silent perfo... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29081/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 1,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29081/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29081",
"html_url": "https://github.com/huggingface/transformers/pull/29081",
"diff_url": "https://github.com/huggingface/transformers/pull/29081.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29081.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29080 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29080/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29080/comments | https://api.github.com/repos/huggingface/transformers/issues/29080/events | https://github.com/huggingface/transformers/issues/29080 | 2,140,668,237 | I_kwDOCUB6oc5_mAFN | 29,080 | repetition_penalty not being applied | {
"login": "adenhaus",
"id": 57678819,
"node_id": "MDQ6VXNlcjU3Njc4ODE5",
"avatar_url": "https://avatars.githubusercontent.com/u/57678819?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/adenhaus",
"html_url": "https://github.com/adenhaus",
"followers_url": "https://api.github.com/users/ade... | [] | closed | false | null | [] | [
"I just noticed the same issue. I think it would be a useful feature.",
"`repetition_penalty` is working on my end:\r\n\r\n```py\r\nfrom transformers import AutoModelForCausalLM, AutoTokenizer\r\n\r\ntokenizer = AutoTokenizer.from_pretrained(\"distilgpt2\")\r\nmodel = AutoModelForCausalLM.from_pretrained(\"distil... | 1,708 | 1,708 | 1,708 | NONE | null | ### System Info
- `transformers` version: 4.37.2
- Platform: Linux-6.1.58+-x86_64-with-glibc2.35
- Python version: 3.10.12
- Huggingface_hub version: 0.20.3
- Safetensors version: 0.4.2
- Accelerate version: 0.27.2
- Accelerate config: not found
- PyTorch version (GPU?): 2.1.0+cu121 (False)
- Tensorflow versi... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29080/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29080/timeline | not_planned | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29079 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29079/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29079/comments | https://api.github.com/repos/huggingface/transformers/issues/29079/events | https://github.com/huggingface/transformers/pull/29079 | 2,140,536,642 | PR_kwDOCUB6oc5nMT9Q | 29,079 | Quantization support for CUDA graph generation. | {
"login": "BlackSamorez",
"id": 16901341,
"node_id": "MDQ6VXNlcjE2OTAxMzQx",
"avatar_url": "https://avatars.githubusercontent.com/u/16901341?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/BlackSamorez",
"html_url": "https://github.com/BlackSamorez",
"followers_url": "https://api.github.c... | [] | open | false | null | [] | [
"cc @younesbelkada "
] | 1,708 | 1,708 | null | CONTRIBUTOR | null | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this w... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29079/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29079/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29079",
"html_url": "https://github.com/huggingface/transformers/pull/29079",
"diff_url": "https://github.com/huggingface/transformers/pull/29079.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29079.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29078 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29078/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29078/comments | https://api.github.com/repos/huggingface/transformers/issues/29078/events | https://github.com/huggingface/transformers/issues/29078 | 2,140,402,242 | I_kwDOCUB6oc5_k_JC | 29,078 | Error while using load_best_model_at_end with LoRA adapters inside Trainer and SFTTrainer | {
"login": "deshwalmahesh",
"id": 50293852,
"node_id": "MDQ6VXNlcjUwMjkzODUy",
"avatar_url": "https://avatars.githubusercontent.com/u/50293852?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/deshwalmahesh",
"html_url": "https://github.com/deshwalmahesh",
"followers_url": "https://api.githu... | [] | open | false | null | [] | [
"cc @younesbelkada ",
"Hi @deshwalmahesh \r\nThanks for the issue! \r\nJust to confirm, what PEFT version do you have? Can you try it out with the latest PEFT? `pip install -U peft`"
] | 1,708 | 1,708 | null | NONE | null | ### System Info
I'm using version `4.37`, and when you use any of the models like `AutoModelForCausalLM, AutoModelForSequenceClassification` along with LoRA adapters, you get an error after training finishes. I used `load_best_model_at_end` with `TrainingArguments` on both the above-mentioned models with `Trainer` a... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29078/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29078/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29077 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29077/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29077/comments | https://api.github.com/repos/huggingface/transformers/issues/29077/events | https://github.com/huggingface/transformers/pull/29077 | 2,139,899,107 | PR_kwDOCUB6oc5nKCnG | 29,077 | New model support RTDETR | {
"login": "SangbumChoi",
"id": 34004152,
"node_id": "MDQ6VXNlcjM0MDA0MTUy",
"avatar_url": "https://avatars.githubusercontent.com/u/34004152?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SangbumChoi",
"html_url": "https://github.com/SangbumChoi",
"followers_url": "https://api.github.com/... | [] | open | false | null | [] | [
"Looking good @SangbumChoi! Let us know when the PR is ready for review 🤗 "
] | 1,708 | 1,708 | null | CONTRIBUTOR | null | # What does this PR do?
This is the new model for RTDETR, the complete version of https://github.com/huggingface/transformers/pull/27247.
There are several TODOs:
- [X] reslove conflicts
- [ ] weight files for other 7 RTDETR
- [ ] Edit testing script
- [ ] (optional) enable training
## Before submitt... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29077/reactions",
"total_count": 1,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 1,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29077/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29077",
"html_url": "https://github.com/huggingface/transformers/pull/29077",
"diff_url": "https://github.com/huggingface/transformers/pull/29077.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29077.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29076 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29076/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29076/comments | https://api.github.com/repos/huggingface/transformers/issues/29076/events | https://github.com/huggingface/transformers/issues/29076 | 2,139,855,433 | I_kwDOCUB6oc5_i5pJ | 29,076 | RingAttention Support | {
"login": "Hambaobao",
"id": 48345096,
"node_id": "MDQ6VXNlcjQ4MzQ1MDk2",
"avatar_url": "https://avatars.githubusercontent.com/u/48345096?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Hambaobao",
"html_url": "https://github.com/Hambaobao",
"followers_url": "https://api.github.com/users/... | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | open | false | null | [] | [] | 1,708 | 1,708 | null | NONE | null | ### Feature request
Hello,
I would like to inquire about the potential inclusion of [RingAttention](https://github.com/lhao499/ring-attention) in `Transformers`, which could enable training with longer sequences.
### Motivation
The incorporation of `RingAttention` would significantly enhance the capabilities for ... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29076/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29076/timeline | null | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29075 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29075/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29075/comments | https://api.github.com/repos/huggingface/transformers/issues/29075/events | https://github.com/huggingface/transformers/issues/29075 | 2,139,821,427 | I_kwDOCUB6oc5_ixVz | 29,075 | Inputs left-padded passed to Instruct-Mistral-7B, with FlashAttention-2, causes garbage outputs for the padded sequences | {
"login": "millicentli",
"id": 20379204,
"node_id": "MDQ6VXNlcjIwMzc5MjA0",
"avatar_url": "https://avatars.githubusercontent.com/u/20379204?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/millicentli",
"html_url": "https://github.com/millicentli",
"followers_url": "https://api.github.com/... | [] | open | false | null | [] | [
"Update: so downgrading to `transformers`: 4.34.0 fixed this issue. See: https://huggingface.co/proxy/discuss.huggingface.co/t/fine-tuned-mistral-7b-inference-issue-for-4k-context-length-token-with-transformer-4-35/65295\r\n\r\nThis is still a problem with the `transformers` version noted, though, so would like a fix if possible ... | 1,708 | 1,708 | null | NONE | null | ### System Info
transformers version: 4.36.2
Pytorch version: 2.2.0
Platform: Rocky Linux release 8.8 (Green Obsidian), 4.18.0-477.27.1.el8_8.x86_64
Python version: Python 3.9.18
Accelerate version: 0.26.1
FlashAttention-2 version: 2.5.3
### Who can help?
@ArthurZucker, @younesbelkada
### Information
... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29075/reactions",
"total_count": 1,
"+1": 1,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29075/timeline | null | null | null |
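The report above concerns batches that are left-padded before generation. As a plain-Python illustration of what that input layout looks like, independent of the tokenizer API and using hypothetical token ids, left padding can be sketched as:

```python
def left_pad_batch(sequences, pad_id):
    """Left-pad variable-length token-id sequences into a rectangular batch.

    Returns (input_ids, attention_mask), where the mask is 0 over padding and
    1 over real tokens: the layout produced when tokenizer.padding_side is
    set to "left".
    """
    width = max(len(seq) for seq in sequences)
    input_ids, attention_mask = [], []
    for seq in sequences:
        n_pad = width - len(seq)
        input_ids.append([pad_id] * n_pad + list(seq))
        attention_mask.append([0] * n_pad + [1] * len(seq))
    return input_ids, attention_mask
```

The generation path must consult the attention mask to skip pad positions; the bug report suggests the FlashAttention-2 path was not doing so for the padded rows.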
https://api.github.com/repos/huggingface/transformers/issues/29074 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29074/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29074/comments | https://api.github.com/repos/huggingface/transformers/issues/29074/events | https://github.com/huggingface/transformers/issues/29074 | 2,139,812,888 | I_kwDOCUB6oc5_ivQY | 29,074 | NotImplementedError | {
"login": "vincent507cpu",
"id": 29680509,
"node_id": "MDQ6VXNlcjI5NjgwNTA5",
"avatar_url": "https://avatars.githubusercontent.com/u/29680509?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/vincent507cpu",
"html_url": "https://github.com/vincent507cpu",
"followers_url": "https://api.githu... | [] | closed | false | null | [] | [] | 1,708 | 1,708 | 1,708 | NONE | null | ### System Info
I was trying to run **https://github.com/dvlab-research/LongLoRA/blob/main/fine-tune.py**, but I first encountered `ValueError: Tokenizer class ChatGLMTokenizer does not exist or is not currently imported.` (it's been solved by switching line 135 and line 141 snippets). Then I had a new error (please ... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29074/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29074/timeline | completed | null | null |
https://api.github.com/repos/huggingface/transformers/issues/29073 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29073/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29073/comments | https://api.github.com/repos/huggingface/transformers/issues/29073/events | https://github.com/huggingface/transformers/pull/29073 | 2,139,700,801 | PR_kwDOCUB6oc5nJX8g | 29,073 | Bump cryptography from 42.0.0 to 42.0.2 in /examples/research_projects/decision_transformer | {
"login": "dependabot[bot]",
"id": 49699333,
"node_id": "MDM6Qm90NDk2OTkzMzM=",
"avatar_url": "https://avatars.githubusercontent.com/in/29110?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/dependabot%5Bbot%5D",
"html_url": "https://github.com/apps/dependabot",
"followers_url": "https://a... | [
{
"id": 1905493434,
"node_id": "MDU6TGFiZWwxOTA1NDkzNDM0",
"url": "https://api.github.com/repos/huggingface/transformers/labels/dependencies",
"name": "dependencies",
"color": "0366d6",
"default": false,
"description": "Pull requests that update a dependency file"
},
{
"id": 6410... | closed | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29073). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"OK, I won't notify you again about this release, but will get in touch when a new ... | 1,708 | 1,708 | 1,708 | CONTRIBUTOR | null | Bumps [cryptography](https://github.com/pyca/cryptography) from 42.0.0 to 42.0.2.
<details>
<summary>Changelog</summary>
<p><em>Sourced from <a href="https://github.com/pyca/cryptography/blob/main/CHANGELOG.rst">cryptography's changelog</a>.</em></p>
<blockquote>
<p>42.0.2 - 2024-01-30</p>
<pre><code>
* Updated Windows... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29073/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29073/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29073",
"html_url": "https://github.com/huggingface/transformers/pull/29073",
"diff_url": "https://github.com/huggingface/transformers/pull/29073.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29073.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29072 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29072/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29072/comments | https://api.github.com/repos/huggingface/transformers/issues/29072/events | https://github.com/huggingface/transformers/pull/29072 | 2,139,666,847 | PR_kwDOCUB6oc5nJQGo | 29,072 | Fix a typo in `examples/pytorch/text-classification/run_classification.py` | {
"login": "Ja1Zhou",
"id": 50169346,
"node_id": "MDQ6VXNlcjUwMTY5MzQ2",
"avatar_url": "https://avatars.githubusercontent.com/u/50169346?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Ja1Zhou",
"html_url": "https://github.com/Ja1Zhou",
"followers_url": "https://api.github.com/users/Ja1Zho... | [] | closed | false | null | [] | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29072). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update."
] | 1,708 | 1,708 | 1,708 | CONTRIBUTOR | null | Hi there, this commit fixes a tiny typo in the provided pytorch text classification pipeline. | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29072/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29072/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29072",
"html_url": "https://github.com/huggingface/transformers/pull/29072",
"diff_url": "https://github.com/huggingface/transformers/pull/29072.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29072.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29071 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29071/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29071/comments | https://api.github.com/repos/huggingface/transformers/issues/29071/events | https://github.com/huggingface/transformers/pull/29071 | 2,139,637,479 | PR_kwDOCUB6oc5nJJty | 29,071 | Typo in `modeling_clip.ClipVisionTransformer` | {
"login": "AdityaKane2001",
"id": 64411306,
"node_id": "MDQ6VXNlcjY0NDExMzA2",
"avatar_url": "https://avatars.githubusercontent.com/u/64411306?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/AdityaKane2001",
"html_url": "https://github.com/AdityaKane2001",
"followers_url": "https://api.gi... | [] | open | false | null | [] | [
"@amyeroberts \r\n\r\nYes, I realized that when I tried to incorporate the change in my fork. Maybe the maintainers might have to do this, but what would be a solution in this case? Apart from the brute-force one, i.e. changing names in _all_ hosted clip weights.\r\n\r\n",
"Maybe overloading a try-catch mechanism... | 1,708 | 1,708 | null | CONTRIBUTOR | null | Fixed a typo in `modeling_clip.py`. | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29071/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29071/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29071",
"html_url": "https://github.com/huggingface/transformers/pull/29071",
"diff_url": "https://github.com/huggingface/transformers/pull/29071.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29071.patch",
"merged_at... |
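The review comments above note that fixing the module-name typo also renames its parameters, breaking every published checkpoint. One alternative to the brute-force option (renaming the weights in all hosted checkpoints) is remapping old keys at load time. A minimal, framework-agnostic sketch; the rename table is illustrative:

```python
def remap_state_dict_keys(state_dict, renames):
    """Return a copy of state_dict with old parameter-name prefixes renamed.

    `renames` maps old key prefixes to new ones; values are left untouched.
    """
    remapped = {}
    for key, value in state_dict.items():
        for old, new in renames.items():
            if key.startswith(old):
                key = new + key[len(old):]
                break
        remapped[key] = value
    return remapped

# Illustrative rename table for the kind of typo fix discussed in this PR.
CLIP_RENAMES = {"vision_model.pre_layrnorm.": "vision_model.pre_layernorm."}
```

A loader would apply this remapping before handing the dict to the model, so old checkpoints keep working without re-uploading weights.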
https://api.github.com/repos/huggingface/transformers/issues/29070 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29070/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29070/comments | https://api.github.com/repos/huggingface/transformers/issues/29070/events | https://github.com/huggingface/transformers/pull/29070 | 2,139,577,470 | PR_kwDOCUB6oc5nI8e4 | 29,070 | Add support for fine-tuning CLIP-like models using contrastive-image-text example | {
"login": "tjs-intel",
"id": 74561858,
"node_id": "MDQ6VXNlcjc0NTYxODU4",
"avatar_url": "https://avatars.githubusercontent.com/u/74561858?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/tjs-intel",
"html_url": "https://github.com/tjs-intel",
"followers_url": "https://api.github.com/users/... | [] | closed | false | null | [] | [
"Fixing up this PR as per the contributor guidelines now",
"Happy to receive suggestions for any test candidates",
"This has been manually tested by replacing `openai/clip-vit-base-patch32` in the contrastive-image-text example with the following models:\r\n\r\n```\r\n\tOFA-Sys/chinese-clip-vit-base-patch16\r\n... | 1,708 | 1,708 | 1,708 | CONTRIBUTOR | null | # What does this PR do?
The example [contrastive-image-text](https://github.com/huggingface/transformers/blob/f497f56/examples/pytorch/contrastive-image-text/README.md) works for fine-tuning models that have the `model_type` "clip", but for other models like "chinese_clip" and "siglip" the `VisionTextDualEncoderConf... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29070/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29070/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29070",
"html_url": "https://github.com/huggingface/transformers/pull/29070",
"diff_url": "https://github.com/huggingface/transformers/pull/29070.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29070.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29069 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29069/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29069/comments | https://api.github.com/repos/huggingface/transformers/issues/29069/events | https://github.com/huggingface/transformers/issues/29069 | 2,139,529,715 | I_kwDOCUB6oc5_hqHz | 29,069 | is_vision_availble() is slow and called a lot | {
"login": "collosi",
"id": 138069,
"node_id": "MDQ6VXNlcjEzODA2OQ==",
"avatar_url": "https://avatars.githubusercontent.com/u/138069?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/collosi",
"html_url": "https://github.com/collosi",
"followers_url": "https://api.github.com/users/collosi/fo... | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
},
{
"id": 5769473378,
... | open | false | null | [] | [
"@collosi can you share a minimal code snippet to reproduce your issue?"
] | 1,708 | 1,708 | null | NONE | null | ### Feature request
Memoize the result of `is_vision_available()`, because it depends on installed packages, which are unlikely to change.
### Motivation
`is_vision_available()` in import_utils.py is slow, in that it results in many calls to os.stat and os.listdir, because it is checking import metadata. Because it i... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29069/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29069/timeline | null | null | null |
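The memoization requested above can be sketched with `functools.lru_cache`. Here `is_package_available` is a hypothetical stand-in for transformers' availability helpers, doing a metadata lookup similar in spirit to the real one:

```python
import functools
import importlib.util

@functools.lru_cache(maxsize=None)
def is_package_available(name: str) -> bool:
    # The set of installed packages is effectively fixed for the lifetime of
    # the process, so caching the (comparatively slow) lookup is safe.
    return importlib.util.find_spec(name) is not None
```

After the first call per package name, repeated calls hit the in-memory cache instead of touching import metadata on disk.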
https://api.github.com/repos/huggingface/transformers/issues/29068 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29068/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29068/comments | https://api.github.com/repos/huggingface/transformers/issues/29068/events | https://github.com/huggingface/transformers/issues/29068 | 2,139,498,780 | I_kwDOCUB6oc5_hikc | 29,068 | Add Support for Dataclasses to Trainer | {
"login": "ntenenz",
"id": 8411908,
"node_id": "MDQ6VXNlcjg0MTE5MDg=",
"avatar_url": "https://avatars.githubusercontent.com/u/8411908?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ntenenz",
"html_url": "https://github.com/ntenenz",
"followers_url": "https://api.github.com/users/ntenenz/... | [
{
"id": 2155169140,
"node_id": "MDU6TGFiZWwyMTU1MTY5MTQw",
"url": "https://api.github.com/repos/huggingface/transformers/labels/trainer",
"name": "trainer",
"color": "2ef289",
"default": false,
"description": ""
},
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
... | open | false | null | [] | [
"cc @muellerzr "
] | 1,708 | 1,708 | null | NONE | null | ### Feature request
Update Trainer._prepare_input to natively support Python dataclasses, in support of more structured input objects.
### Motivation
_prepare_input will seamlessly transfer the tensors contained in many datatypes (list, tuple, dict, etc.) to the appropriate device. However, it will not do so for dataclass... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29068/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29068/timeline | null | null | null |
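The request above can be sketched without torch: a recursive helper in the shape of `Trainer._prepare_input`, where `convert` stands in for the `.to(device)` call and the dataclass branch is the proposed addition (the helper name is hypothetical):

```python
import dataclasses
from collections.abc import Mapping

def prepare_input(data, convert):
    """Apply `convert` to every leaf of a nested structure.

    Mirrors the dict/list/tuple recursion of Trainer._prepare_input and adds
    a branch for dataclass *instances*, rebuilt via dataclasses.replace
    (fields excluded from __init__ would need separate handling).
    """
    if dataclasses.is_dataclass(data) and not isinstance(data, type):
        changes = {
            f.name: prepare_input(getattr(data, f.name), convert)
            for f in dataclasses.fields(data)
        }
        return dataclasses.replace(data, **changes)
    if isinstance(data, Mapping):
        return {k: prepare_input(v, convert) for k, v in data.items()}
    if isinstance(data, (list, tuple)):
        return type(data)(prepare_input(v, convert) for v in data)
    return convert(data)
```

Rebuilding with `dataclasses.replace` keeps the original batch object untouched, matching how the existing code returns new containers rather than mutating inputs.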
https://api.github.com/repos/huggingface/transformers/issues/29067 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29067/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29067/comments | https://api.github.com/repos/huggingface/transformers/issues/29067/events | https://github.com/huggingface/transformers/issues/29067 | 2,139,183,717 | I_kwDOCUB6oc5_gVpl | 29,067 | FalconAttention Doesn't use `alibi` when `alibi` is not None if `_use_sdpa==True` | {
"login": "SamanehSaadat",
"id": 1986164,
"node_id": "MDQ6VXNlcjE5ODYxNjQ=",
"avatar_url": "https://avatars.githubusercontent.com/u/1986164?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/SamanehSaadat",
"html_url": "https://github.com/SamanehSaadat",
"followers_url": "https://api.github.... | [] | closed | false | null | [] | [
"Hey! Pretty sure that is because it's not supported in `sdpa` or `flash_attention`. We should/could raise an error, however. ",
"Hi @SamanehSaadat thank you for the notice, please refer to: https://github.com/huggingface/transformers/blob/2f1003be86f11c8d97d7c2e6a7739dbb6fa795f2/src/transformers/models/falc... | 1,708 | 1,708 | 1,708 | NONE | null | ### System Info
- `transformers` version: 4.37.0.dev0
- Platform: Linux-6.5.13-1rodete2-amd64-x86_64
- Python version: 3.10.13
- Huggingface_hub version: 0.20.3
- Safetensors version: 0.4.2
- Accelerate version: not installed
- Accelerate config: not found
- PyTorch version (GPU?): 2.2.0+cu121 (False)
- Tensor... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29067/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29067/timeline | completed | null | null |
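The first comment above suggests raising an error rather than silently ignoring `alibi`. A hypothetical guard in that spirit; note this is only the error-raising pattern, not transformers' actual code path, since the fix linked in the second comment handles `alibi` inside the SDPA path instead:

```python
def ensure_alibi_supported(attn_implementation: str, alibi) -> None:
    """Fail loudly when a backend would silently drop alibi biases.

    Hypothetical validation sketch for the behaviour discussed in the issue.
    """
    if alibi is not None and attn_implementation in ("sdpa", "flash_attention_2"):
        raise ValueError(
            f"alibi positional biases are ignored by the {attn_implementation!r} "
            "attention implementation; use attn_implementation='eager' instead."
        )
```

An explicit error at model construction time is cheaper to debug than silently divergent logits at inference time.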
https://api.github.com/repos/huggingface/transformers/issues/29066 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29066/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29066/comments | https://api.github.com/repos/huggingface/transformers/issues/29066/events | https://github.com/huggingface/transformers/pull/29066 | 2,139,025,735 | PR_kwDOCUB6oc5nHBPp | 29,066 | Bnb test fix for different hardwares | {
"login": "Titus-von-Koeller",
"id": 9048635,
"node_id": "MDQ6VXNlcjkwNDg2MzU=",
"avatar_url": "https://avatars.githubusercontent.com/u/9048635?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/Titus-von-Koeller",
"html_url": "https://github.com/Titus-von-Koeller",
"followers_url": "https:/... | [] | closed | false | {
"login": "younesbelkada",
"id": 49240599,
"node_id": "MDQ6VXNlcjQ5MjQwNTk5",
"avatar_url": "https://avatars.githubusercontent.com/u/49240599?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/younesbelkada",
"html_url": "https://github.com/younesbelkada",
"followers_url": "https://api.githu... | [
{
"login": "younesbelkada",
"id": 49240599,
"node_id": "MDQ6VXNlcjQ5MjQwNTk5",
"avatar_url": "https://avatars.githubusercontent.com/u/49240599?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/younesbelkada",
"html_url": "https://github.com/younesbelkada",
"followers_url"... | [
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29066). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"Could one of you please merge? I don't have permission to do so."
] | 1,708 | 1,708 | 1,708 | CONTRIBUTOR | null | Just updating the acceptable generated text, as this is slightly different based on hardware from what I can tell. The first commit is based on what I observed on my dev VM with A10G and the second commit is based on what I saw [in the failing BNB integration pipeline](https://github.com/huggingface/peft/actions/runs/7... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29066/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29066/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29066",
"html_url": "https://github.com/huggingface/transformers/pull/29066",
"diff_url": "https://github.com/huggingface/transformers/pull/29066.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29066.patch",
"merged_at... |
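The fix above widens an exact-match text assertion because generated text can differ slightly across GPU generations. A common pattern is to assert membership in a set of known-good strings; the strings below are illustrative, not the PR's actual expected outputs:

```python
EXPECTED_OUTPUTS = {
    # Hypothetical accepted generations, one per observed hardware variant.
    "Hello my name is John and I am a student at",
    "Hello my name is John Doe, and I am a",
}

def assert_generation_acceptable(text: str) -> None:
    """Pass if `text` matches any known-good generation, else fail loudly."""
    if text not in EXPECTED_OUTPUTS:
        raise AssertionError(
            f"Unexpected generation {text!r}; accepted outputs: {sorted(EXPECTED_OUTPUTS)}"
        )
```

Listing every observed variant keeps the test strict about regressions while tolerating benign numeric differences between accelerators.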
https://api.github.com/repos/huggingface/transformers/issues/29065 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29065/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29065/comments | https://api.github.com/repos/huggingface/transformers/issues/29065/events | https://github.com/huggingface/transformers/pull/29065 | 2,139,009,551 | PR_kwDOCUB6oc5nG9pQ | 29,065 | Fix WhisperNoSpeechDetection when input is full silence | {
"login": "ylacombe",
"id": 52246514,
"node_id": "MDQ6VXNlcjUyMjQ2NTE0",
"avatar_url": "https://avatars.githubusercontent.com/u/52246514?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/ylacombe",
"html_url": "https://github.com/ylacombe",
"followers_url": "https://api.github.com/users/yla... | [] | open | false | null | [] | [
"I want to add a test, but I realized most of the slow tests of Whisper were already failing, independently of this PR. \r\ncc @sanchit-gandhi ",
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29065). All of your documentation changes will be reflected on that endpoint.... | 1,708 | 1,708 | null | COLLABORATOR | null | # What does this PR do?
@cifkao found an edge case that happens when the input of Whisper.generate is a full silence. This is a simple tentative PR.
cc @sanchit-gandhi
<!-- Remove if not applicable -->
Fixes #29036
| {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29065/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29065/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29065",
"html_url": "https://github.com/huggingface/transformers/pull/29065",
"diff_url": "https://github.com/huggingface/transformers/pull/29065.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29065.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29064 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29064/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29064/comments | https://api.github.com/repos/huggingface/transformers/issues/29064/events | https://github.com/huggingface/transformers/issues/29064 | 2,138,935,677 | I_kwDOCUB6oc5_fZF9 | 29,064 | Swapping `tqdm` to `rich` | {
"login": "alexge233",
"id": 6159747,
"node_id": "MDQ6VXNlcjYxNTk3NDc=",
"avatar_url": "https://avatars.githubusercontent.com/u/6159747?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/alexge233",
"html_url": "https://github.com/alexge233",
"followers_url": "https://api.github.com/users/al... | [
{
"id": 2648621985,
"node_id": "MDU6TGFiZWwyNjQ4NjIxOTg1",
"url": "https://api.github.com/repos/huggingface/transformers/labels/Feature%20request",
"name": "Feature request",
"color": "FBCA04",
"default": false,
"description": "Request for a new feature"
}
] | open | false | null | [] | [
"> I can see on github it's at `transformers/src/transformers/tokenization_utils_fast.py` and I can see in lines #790 and #791 that there's a further method `train_from_iterator` but at this point I can't find where the actual code is? Can anyone point me to the right direction?\r\n\r\n`train_from_iterator` is defi... | 1,708 | 1,708 | null | NONE | null | ### Feature request
Hi, for `AutoTokenizer.train_new_from_iterator` there's a hardcoded `tqdm` progress bar that I want to swap to `rich`, and I'm happy to PR the change back.
I can see on GitHub that it's at `transformers/src/transformers/tokenization_utils_fast.py`, and in lines #790 and #791 there's a further method `train_from_iterator`, but at this point I can't find where the actual code is. Can anyone point me in the right direction?
"url": "https://api.github.com/repos/huggingface/transformers/issues/29064/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29064/timeline | null | null | null |
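One low-risk way to enable the swap requested above is to route progress reporting through a tiny interface, so the backend (tqdm, rich, or nothing) becomes pluggable. A stdlib-only sketch with hypothetical names; a real patch would lazily import `rich.progress` behind the same interface:

```python
class NullProgress:
    """Fallback backend: counts updates, renders nothing."""

    def __init__(self, total):
        self.total = total
        self.count = 0

    def update(self, n=1):
        self.count += n

    def close(self):
        pass

def make_progress(total, backend="null"):
    # A real implementation would register tqdm/rich backends here, e.g.
    # attempting `from rich.progress import Progress` and falling back to
    # NullProgress when the library is not installed.
    backends = {"null": NullProgress}
    return backends[backend](total)
```

Training loops then call `update()`/`close()` without caring which library, if any, draws the bar.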
https://api.github.com/repos/huggingface/transformers/issues/29063 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29063/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29063/comments | https://api.github.com/repos/huggingface/transformers/issues/29063/events | https://github.com/huggingface/transformers/pull/29063 | 2,138,892,403 | PR_kwDOCUB6oc5nGj0m | 29,063 | Raise unused kwargs image processor | {
"login": "molbap",
"id": 39954772,
"node_id": "MDQ6VXNlcjM5OTU0Nzcy",
"avatar_url": "https://avatars.githubusercontent.com/u/39954772?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/molbap",
"html_url": "https://github.com/molbap",
"followers_url": "https://api.github.com/users/molbap/fo... | [] | closed | false | null | [] | [
"Finishing up draft, removing this validation functionality from #28711 ",
"The docs for this PR live [here](https://huggingface.co/proxy/moon-ci-docs.huggingface.co/docs/transformers/pr_29063). All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.",
"Main ... | 1,708 | 1,708 | 1,708 | CONTRIBUTOR | null | # What does this PR do?
This PR captures all kwargs passed to an `ImageProcessor` `preprocess` method and compares them to what's expected, raising an exception or logging an informative message when a difference is found.
This will
1) Make preprocess methods more reliable
2) Inform users when an expected kwargs... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29063/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29063/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29063",
"html_url": "https://github.com/huggingface/transformers/pull/29063",
"diff_url": "https://github.com/huggingface/transformers/pull/29063.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29063.patch",
"merged_at... |
https://api.github.com/repos/huggingface/transformers/issues/29062 | https://api.github.com/repos/huggingface/transformers | https://api.github.com/repos/huggingface/transformers/issues/29062/labels{/name} | https://api.github.com/repos/huggingface/transformers/issues/29062/comments | https://api.github.com/repos/huggingface/transformers/issues/29062/events | https://github.com/huggingface/transformers/pull/29062 | 2,138,763,388 | PR_kwDOCUB6oc5nGHd6 | 29,062 | [WIP] Add FLMR model | {
"login": "LinWeizheDragon",
"id": 33350454,
"node_id": "MDQ6VXNlcjMzMzUwNDU0",
"avatar_url": "https://avatars.githubusercontent.com/u/33350454?v=4",
"gravatar_id": "",
"url": "https://api.github.com/users/LinWeizheDragon",
"html_url": "https://github.com/LinWeizheDragon",
"followers_url": "https://api... | [] | open | false | null | [] | [
"@ArthurZucker\r\n@younesbelkada\r\n@amyeroberts\r\nCan anyone help me to finish this PR or assign someone knowledgeable? The whole process is a little complicated and I have no idea what to do next. Thanks!",
"Hello @LinWeizheDragon, could you update the readme to have links to the pretrained checkpoints, the or... | 1,708 | 1,708 | null | NONE | null | # What does this PR do?
<!--
Congratulations! You've made it this far! You're not quite done yet though.
Once merged, your PR is going to appear in the release notes with the title you set, so make sure it's a great title that fully reflects the extent of your awesome contribution.
Then, please replace this w... | {
"url": "https://api.github.com/repos/huggingface/transformers/issues/29062/reactions",
"total_count": 0,
"+1": 0,
"-1": 0,
"laugh": 0,
"hooray": 0,
"confused": 0,
"heart": 0,
"rocket": 0,
"eyes": 0
} | https://api.github.com/repos/huggingface/transformers/issues/29062/timeline | null | false | {
"url": "https://api.github.com/repos/huggingface/transformers/pulls/29062",
"html_url": "https://github.com/huggingface/transformers/pull/29062",
"diff_url": "https://github.com/huggingface/transformers/pull/29062.diff",
"patch_url": "https://github.com/huggingface/transformers/pull/29062.patch",
"merged_at... |