Column types: `repo_name` stringclasses (6 values), `pr_number` int64 (99 to 20.3k), `pr_title` stringlengths (8 to 158), `pr_description` stringlengths (0 to 6.54k), `author` stringlengths (4 to 18), `date_created` timestamp[ns, tz=UTC], `date_merged` timestamp[ns, tz=UTC], `previous_commit` stringlengths (40), `pr_commit` stringlengths (40), `query` stringlengths (37 to 6.57k), `filepath` stringlengths (8 to 153), `before_content` stringlengths (0 to 876M), `after_content` stringlengths (0 to 876M), `label` int64 (-1 to 1).

| repo_name | pr_number | pr_title | pr_description | author | date_created | date_merged | previous_commit | pr_commit | query | filepath | before_content | after_content | label |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./src/transformers/models/data2vec/configuration_data2vec_text.py | # coding=utf-8<br># Copyright 2022 The HuggingFace Inc. team. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/licenses/LICENSE-2.0<br>#<br># Unless r... | # coding=utf-8<br># Copyright 2022 The HuggingFace Inc. team. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/licenses/LICENSE-2.0<br>#<br># Unless r... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./src/transformers/models/swin/convert_swin_simmim_to_pytorch.py | # coding=utf-8<br># Copyright 2022 The HuggingFace Inc. team.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/licenses/LICENSE-2.0<br>#<br># Unless required by applicable... | # coding=utf-8<br># Copyright 2022 The HuggingFace Inc. team.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/licenses/LICENSE-2.0<br>#<br># Unless required by applicable... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./src/transformers/models/ctrl/modeling_tf_ctrl.py | # coding=utf-8<br># Copyright 2018 Salesforce and HuggingFace Inc. team.<br># Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># h... | # coding=utf-8<br># Copyright 2018 Salesforce and HuggingFace Inc. team.<br># Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># h... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./src/transformers/models/vit_msn/__init__.py | # flake8: noqa<br># There's no way to ignore "F401 '...' imported but unused" warnings in this<br># module, but to preserve other warnings. So, don't check this module at all.<br># Copyright 2020 The HuggingFace Team. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use thi... | # flake8: noqa<br># There's no way to ignore "F401 '...' imported but unused" warnings in this<br># module, but to preserve other warnings. So, don't check this module at all.<br># Copyright 2020 The HuggingFace Team. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use thi... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./src/transformers/models/table_transformer/modeling_table_transformer.py | # coding=utf-8<br># Copyright 2022 Microsoft Research and The HuggingFace Inc. team. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/licenses/L... | # coding=utf-8<br># Copyright 2022 Microsoft Research and The HuggingFace Inc. team. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/licenses/L... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./src/transformers/models/codegen/tokenization_codegen.py | # coding=utf-8<br># Copyright 2022 The Salesforce authors, The Open AI Team Authors and The HuggingFace Inc. team.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/l... | # coding=utf-8<br># Copyright 2022 The Salesforce authors, The Open AI Team Authors and The HuggingFace Inc. team.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/l... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./src/transformers/models/vit_mae/modeling_tf_vit_mae.py | # coding=utf-8<br># Copyright 2022 Facebook AI and The HuggingFace Inc. team. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/licenses/LICENSE-... | # coding=utf-8<br># Copyright 2022 Facebook AI and The HuggingFace Inc. team. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/licenses/LICENSE-... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./docs/source/en/model_doc/perceiver.mdx | <!--Copyright 2021 The HuggingFace Team. All rights reserved.<br>Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with<br>the License. You may obtain a copy of the License at<br>http://www.apache.org/licenses/LICENSE-2.0<br>Unless required by applicable law or agreed... | <!--Copyright 2021 The HuggingFace Team. All rights reserved.<br>Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with<br>the License. You may obtain a copy of the License at<br>http://www.apache.org/licenses/LICENSE-2.0<br>Unless required by applicable law or agreed... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./docs/source/en/installation.mdx | <!---<br>Copyright 2022 The HuggingFace Team. All rights reserved.<br>Licensed under the Apache License, Version 2.0 (the "License");<br>you may not use this file except in compliance with the License.<br>You may obtain a copy of the License at<br>http://www.apache.org/licenses/LICENSE-2.0<br>Unless required by applicable law or ... | <!---<br>Copyright 2022 The HuggingFace Team. All rights reserved.<br>Licensed under the Apache License, Version 2.0 (the "License");<br>you may not use this file except in compliance with the License.<br>You may obtain a copy of the License at<br>http://www.apache.org/licenses/LICENSE-2.0<br>Unless required by applicable law or ... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./docs/source/en/model_doc/wav2vec2_phoneme.mdx | <!--Copyright 2021 The HuggingFace Team. All rights reserved.<br>Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with<br>the License. You may obtain a copy of the License at<br>http://www.apache.org/licenses/LICENSE-2.0<br>Unless required by applicable law or agreed... | <!--Copyright 2021 The HuggingFace Team. All rights reserved.<br>Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with<br>the License. You may obtain a copy of the License at<br>http://www.apache.org/licenses/LICENSE-2.0<br>Unless required by applicable law or agreed... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./src/transformers/pipelines/text_generation.py | import enum<br>import warnings<br>from transformers import MODEL_FOR_CAUSAL_LM_MAPPING, TF_MODEL_FOR_CAUSAL_LM_MAPPING<br>from ..utils import add_end_docstrings, is_tf_available<br>from .base import PIPELINE_INIT_ARGS, Pipeline<br>if is_tf_available():<br>import tensorflow as tf<br>class ReturnType(enum.Enum):<br>TENSORS = 0<br>... | import enum<br>import warnings<br>from transformers import MODEL_FOR_CAUSAL_LM_MAPPING, TF_MODEL_FOR_CAUSAL_LM_MAPPING<br>from ..utils import add_end_docstrings, is_tf_available<br>from .base import PIPELINE_INIT_ARGS, Pipeline<br>if is_tf_available():<br>import tensorflow as tf<br>class ReturnType(enum.Enum):<br>TENSORS = 0<br>... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./src/transformers/pipelines/question_answering.py | import types<br>import warnings<br>from collections.abc import Iterable<br>from typing import TYPE_CHECKING, Dict, List, Optional, Tuple, Union<br>import numpy as np<br>from ..data import SquadExample, SquadFeatures, squad_convert_examples_to_features<br>from ..modelcard import ModelCard<br>from ..tokenization_utils import PreTrainedToke... | import types<br>import warnings<br>from collections.abc import Iterable<br>from typing import TYPE_CHECKING, Dict, List, Optional, Tuple, Union<br>import numpy as np<br>from ..data import SquadExample, SquadFeatures, squad_convert_examples_to_features<br>from ..modelcard import ModelCard<br>from ..tokenization_utils import PreTrainedToke... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./src/transformers/models/led/__init__.py | # flake8: noqa<br># There's no way to ignore "F401 '...' imported but unused" warnings in this<br># module, but to preserve other warnings. So, don't check this module at all.<br># Copyright 2021 The HuggingFace Team. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use thi... | # flake8: noqa<br># There's no way to ignore "F401 '...' imported but unused" warnings in this<br># module, but to preserve other warnings. So, don't check this module at all.<br># Copyright 2021 The HuggingFace Team. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use thi... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./examples/research_projects/lxmert/demo.ipynb | {<br>"cells": [<br>{<br>"cell_type": "code",<br>"execution_count": 1,<br>"metadata": {},<br>"outputs": [],<br>"source": [<br>"# %pip install-r requirements.txt"<br>]<br>},<br>{<br>"cell_type": "code",<br>"execution_count": 1,<br>"metadata": {},<br>"outputs": [<br>{<br>"name": "stderr",<br>"output_type": "stream",<br>... | {<br>"cells": [<br>{<br>"cell_type": "code",<br>"execution_count": 1,<br>"metadata": {},<br>"outputs": [],<br>"source": [<br>"# %pip install-r requirements.txt"<br>]<br>},<br>{<br>"cell_type": "code",<br>"execution_count": 1,<br>"metadata": {},<br>"outputs": [<br>{<br>"name": "stderr",<br>"output_type": "stream",<br>... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./examples/research_projects/distillation/training_configs/distilgpt2.json | {<br>"initializer_range": 0.02,<br>"layer_norm_epsilon": 0.00001,<br>"n_embd": 768,<br>"n_head": 12,<br>"n_layer": 6,<br>"n_positions": 1024,<br>"vocab_size": 50257<br>} | {<br>"initializer_range": 0.02,<br>"layer_norm_epsilon": 0.00001,<br>"n_embd": 768,<br>"n_head": 12,<br>"n_layer": 6,<br>"n_positions": 1024,<br>"vocab_size": 50257<br>} | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./docs/source/es/_config.py | # docstyle-ignore<br>INSTALL_CONTENT = """<br># Transformers installation<br>! pip install transformers datasets<br># To install from source instead of the last release, comment the command above and uncomment the following one.<br># ! pip install git+https://github.com/huggingface/transformers.git<br>"""<br>notebook_first_cells = [{"type... | # docstyle-ignore<br>INSTALL_CONTENT = """<br># Transformers installation<br>! pip install transformers datasets<br># To install from source instead of the last release, comment the command above and uncomment the following one.<br># ! pip install git+https://github.com/huggingface/transformers.git<br>"""<br>notebook_first_cells = [{"type... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./examples/tensorflow/summarization/README.md | <!---<br>Copyright 2021 The HuggingFace Team. All rights reserved.<br>Licensed under the Apache License, Version 2.0 (the "License");<br>you may not use this file except in compliance with the License.<br>You may obtain a copy of the License at<br>http://www.apache.org/licenses/LICENSE-2.0<br>Unless required by applicable law or ... | <!---<br>Copyright 2021 The HuggingFace Team. All rights reserved.<br>Licensed under the Apache License, Version 2.0 (the "License");<br>you may not use this file except in compliance with the License.<br>You may obtain a copy of the License at<br>http://www.apache.org/licenses/LICENSE-2.0<br>Unless required by applicable law or ... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./src/transformers/pipelines/visual_question_answering.py | from typing import Union<br>from ..utils import add_end_docstrings, is_torch_available, is_vision_available, logging<br>from .base import PIPELINE_INIT_ARGS, Pipeline<br>if is_vision_available():<br>from PIL import Image<br>from ..image_utils import load_image<br>if is_torch_available():<br>from ..models.auto.modeling_auto... | from typing import Union<br>from ..utils import add_end_docstrings, is_torch_available, is_vision_available, logging<br>from .base import PIPELINE_INIT_ARGS, Pipeline<br>if is_vision_available():<br>from PIL import Image<br>from ..image_utils import load_image<br>if is_torch_available():<br>from ..models.auto.modeling_auto... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./src/transformers/commands/user.py | # Copyright 2020 The HuggingFace Team. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/licenses/LICENSE-2.0<br>#<br># Unless required by applicabl... | # Copyright 2020 The HuggingFace Team. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/licenses/LICENSE-2.0<br>#<br># Unless required by applicabl... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./examples/research_projects/visual_bert/modeling_frcnn.py | """<br>coding=utf-8<br>Copyright 2018, Antonio Mendoza Hao Tan, Mohit Bansal<br>Adapted From Facebook Inc, Detectron2 && Huggingface Co.<br>Licensed under the Apache License, Version 2.0 (the "License");<br>you may not use this file except in compliance with the License.<br>You may obtain a copy of the License at<br>http://www... | """<br>coding=utf-8<br>Copyright 2018, Antonio Mendoza Hao Tan, Mohit Bansal<br>Adapted From Facebook Inc, Detectron2 && Huggingface Co.<br>Licensed under the Apache License, Version 2.0 (the "License");<br>you may not use this file except in compliance with the License.<br>You may obtain a copy of the License at<br>http://www... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./tests/models/wav2vec2_with_lm/__init__.py |  |  | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./docs/source/en/task_summary.mdx | <!--Copyright 2020 The HuggingFace Team. All rights reserved.<br>Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with<br>the License. You may obtain a copy of the License at<br>http://www.apache.org/licenses/LICENSE-2.0<br>Unless required by applicable law or agreed... | <!--Copyright 2020 The HuggingFace Team. All rights reserved.<br>Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with<br>the License. You may obtain a copy of the License at<br>http://www.apache.org/licenses/LICENSE-2.0<br>Unless required by applicable law or agreed... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./src/transformers/models/funnel/tokenization_funnel.py | # coding=utf-8<br># Copyright 2020 The HuggingFace Inc. team.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/licenses/LICENSE-2.0<br>#<br># Unless required by applicable... | # coding=utf-8<br># Copyright 2020 The HuggingFace Inc. team.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/licenses/LICENSE-2.0<br>#<br># Unless required by applicable... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./src/transformers/pipelines/text2text_generation.py | import enum<br>import warnings<br>from ..tokenization_utils import TruncationStrategy<br>from ..utils import add_end_docstrings, is_tf_available, is_torch_available, logging<br>from .base import PIPELINE_INIT_ARGS, Pipeline<br>if is_tf_available():<br>import tensorflow as tf<br>from ..models.auto.modeling_tf_auto import TF_MODE... | import enum<br>import warnings<br>from ..tokenization_utils import TruncationStrategy<br>from ..utils import add_end_docstrings, is_tf_available, is_torch_available, logging<br>from .base import PIPELINE_INIT_ARGS, Pipeline<br>if is_tf_available():<br>import tensorflow as tf<br>from ..models.auto.modeling_tf_auto import TF_MODE... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./docs/source/en/main_classes/image_processor.mdx | <!--Copyright 2022 The HuggingFace Team. All rights reserved.<br>Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with<br>the License. You may obtain a copy of the License at<br>http://www.apache.org/licenses/LICENSE-2.0<br>Unless required by applicable law or agreed... | <!--Copyright 2022 The HuggingFace Team. All rights reserved.<br>Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with<br>the License. You may obtain a copy of the License at<br>http://www.apache.org/licenses/LICENSE-2.0<br>Unless required by applicable law or agreed... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./src/transformers/models/wav2vec2_conformer/convert_wav2vec2_conformer_original_pytorch_checkpoint_to_pytorch.py | # coding=utf-8<br># Copyright 2022 The HuggingFace Inc. team.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/licenses/LICENSE-2.0<br>#<br># Unless required by applicable... | # coding=utf-8<br># Copyright 2022 The HuggingFace Inc. team.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/licenses/LICENSE-2.0<br>#<br># Unless required by applicable... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./tests/models/bert/test_modeling_tf_bert.py | # coding=utf-8<br># Copyright 2020 The HuggingFace Team. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/licenses/LICENSE-2.0<br>#<br># Unless requir... | # coding=utf-8<br># Copyright 2020 The HuggingFace Team. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/licenses/LICENSE-2.0<br>#<br># Unless requir... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | # coding=utf-8<br># Copyright 2021 The Fairseq Authors and the HuggingFace Inc. team. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/licenses/... | # coding=utf-8<br># Copyright 2021 The Fairseq Authors and the HuggingFace Inc. team. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/licenses/... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./src/transformers/models/electra/__init__.py | # flake8: noqa<br># There's no way to ignore "F401 '...' imported but unused" warnings in this<br># module, but to preserve other warnings. So, don't check this module at all.<br># Copyright 2020 The HuggingFace Team. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use thi... | # flake8: noqa<br># There's no way to ignore "F401 '...' imported but unused" warnings in this<br># module, but to preserve other warnings. So, don't check this module at all.<br># Copyright 2020 The HuggingFace Team. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use thi... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./src/transformers/models/canine/__init__.py | # flake8: noqa<br># There's no way to ignore "F401 '...' imported but unused" warnings in this<br># module, but to preserve other warnings. So, don't check this module at all.<br># Copyright 2020 The HuggingFace Team. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use thi... | # flake8: noqa<br># There's no way to ignore "F401 '...' imported but unused" warnings in this<br># module, but to preserve other warnings. So, don't check this module at all.<br># Copyright 2020 The HuggingFace Team. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use thi... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./src/transformers/models/gptj/modeling_tf_gptj.py | # coding=utf-8<br># Copyright 2022 The EleutherAI and HuggingFace Teams. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/licenses/LICENSE-2.0<br>#... | # coding=utf-8<br># Copyright 2022 The EleutherAI and HuggingFace Teams. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/licenses/LICENSE-2.0<br>#... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./tests/utils/test_generic.py | # coding=utf-8<br># Copyright 2019-present, the HuggingFace Inc. team.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/licenses/LICENSE-2.0<br>#<br># Unless required by a... | # coding=utf-8<br># Copyright 2019-present, the HuggingFace Inc. team.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/licenses/LICENSE-2.0<br>#<br># Unless required by a... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?<br>Type hints<br>@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?<br>Type hints<br>@Rocketknight1 | ./src/transformers/models/pegasus_x/configuration_pegasus_x.py | # coding=utf-8<br># Copyright 2022, Google and The HuggingFace Inc. team. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/licenses/LICENSE-2.0<br>... | # coding=utf-8<br># Copyright 2022, Google and The HuggingFace Inc. team. All rights reserved.<br>#<br># Licensed under the Apache License, Version 2.0 (the "License");<br># you may not use this file except in compliance with the License.<br># You may obtain a copy of the License at<br>#<br># http://www.apache.org/licenses/LICENSE-2.0<br>... | -1 |
| huggingface/transformers | 20,217 | remaining pytorch type hints | # What does this PR do?
Type hints
@Rocketknight1 | IMvision12 | 2022-11-14T18:43:23Z | 2022-11-16T16:53:40Z | 9ea1dbd2bed21a50cdc52e9e41a906d2ae155a66 | d4d23141c42898a2d3eb4c39baa9b63b72093fd9 | remaining pytorch type hints. # What does this PR do?
Type hints
@Rocketknight1 | ./src/transformers/models/mbart/modeling_flax_mbart.py | # coding=utf-8
# Copyright 2021, The Facebook AI Research Team and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.or... | # coding=utf-8
# Copyright 2021, The Facebook AI Research Team and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.or... | -1 |
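The description of PR 20217 is terse ("Type hints"), but the kind of change it names — annotating existing PyTorch-side function signatures — can be sketched with a toy function. Everything below (the function name, its parameters, and its behavior) is invented for illustration and is not code from the PR:

```python
from typing import List, Optional, Tuple


# Before a type-hints PR, a signature like this would read
# `def scale_logits(logits, temperature=1.0, top_k=None):` with no
# annotations; the PR-style change adds parameter and return types.
def scale_logits(
    logits: List[float],
    temperature: float = 1.0,
    top_k: Optional[int] = None,
) -> Tuple[List[float], int]:
    """Divide logits by a temperature and optionally keep only the first k."""
    scaled = [x / temperature for x in logits]
    kept = len(scaled) if top_k is None else min(top_k, len(scaled))
    return scaled[:kept], kept


print(scale_logits([2.0, 4.0], temperature=2.0))  # ([1.0, 2.0], 2)
```

The annotations change nothing at runtime; they only make the signature checkable by tools such as mypy, which is the point of this category of PR.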
repo_name: huggingface/transformers
pr_number: 20214
pr_title: Allow trainer to return eval. loss for CLIP-like models
pr_description:
  # What does this PR do?
  Allow trainer to give **evaluation** loss for CLIP-like models.
  Currently, this line
  https://github.com/huggingface/transformers/blob/07d8d6e2f7a920d399e5e86a82d78179cdfa6746/src/transformers/trainer.py#L3192
  gives `has_labels = False` for CLIP-like models, and can't give loss value in t...
author: ydshieh
date_created: 2022-11-14T16:53:51Z
date_merged: 2022-11-15T18:47:10Z
previous_commit: 822ae69c1b1c486b6ed277964906e273888221a3
pr_commit: 0d0d77693f79c7f7d39bba6921cc9741f00de988
rows (filepath | label) — the before_content/after_content cells are truncated file previews (Apache license headers, import blocks) and are elided below; empty cells belong to empty `__init__.py` files:
  ./src/transformers/trainer.py | 1
  ./src/transformers/utils/__init__.py | 1
  ./src/transformers/utils/generic.py | 1
  ./src/transformers/models/luke/__init__.py | -1
  ./utils/check_table.py | -1
  ./src/transformers/models/bigbird_pegasus/__init__.py | -1
  ./tests/models/xlm_roberta/test_modeling_flax_xlm_roberta.py | -1
  ./src/transformers/models/squeezebert/configuration_squeezebert.py | -1
  ./tests/models/pegasus_x/__init__.py | -1
  ./src/transformers/models/vit/modeling_flax_vit.py | -1
  ./tests/models/bert_japanese/test_tokenization_bert_japanese.py | -1
  ./tests/models/speech_to_text/__init__.py | -1
  ./src/transformers/onnx/features.py | -1
  ./src/transformers/models/herbert/tokenization_herbert.py | -1
  ./tests/models/visual_bert/__init__.py | -1
  ./tests/models/camembert/test_tokenization_camembert.py | -1
  ./src/transformers/models/albert/__init__.py | -1
  ./examples/pytorch/question-answering/trainer_qa.py | -1
  ./tests/models/auto/test_processor_auto.py | -1
  ./src/transformers/utils/model_parallel_utils.py | -1
  ./tests/models/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | (row truncated; label missing)
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | -1 |
huggingface/transformers | 20,214 | Allow trainer to return eval. loss for CLIP-like models | # What does this PR do?
Allow trainer to give **evaluation** loss for CLIP-like models.
Currently, this line
https://github.com/huggingface/transformers/blob/07d8d6e2f7a920d399e5e86a82d78179cdfa6746/src/transformers/trainer.py#L3192
gives `has_labels = False` for CLIP-like models, and can't give loss value in t... | ydshieh | 2022-11-14T16:53:51Z | 2022-11-15T18:47:10Z | 822ae69c1b1c486b6ed277964906e273888221a3 | 0d0d77693f79c7f7d39bba6921cc9741f00de988 | Allow trainer to return eval. loss for CLIP-like models. # What does this PR do?
Allow trainer to give **evaluation** loss for CLIP-like models.
Currently, this line
https://github.com/huggingface/transformers/blob/07d8d6e2f7a920d399e5e86a82d78179cdfa6746/src/transformers/trainer.py#L3192
gives `has_labels = Fa... | ./src/transformers/models/layoutlm/modeling_layoutlm.py | # coding=utf-8
# Copyright 2018 The Microsoft Research Asia LayoutLM Team Authors and the HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/... | # coding=utf-8
# Copyright 2018 The Microsoft Research Asia LayoutLM Team Authors and the HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/... | -1 |
huggingface/transformers | 20,214 | Allow trainer to return eval. loss for CLIP-like models | # What does this PR do?
Allow trainer to give **evaluation** loss for CLIP-like models.
Currently, this line
https://github.com/huggingface/transformers/blob/07d8d6e2f7a920d399e5e86a82d78179cdfa6746/src/transformers/trainer.py#L3192
gives `has_labels = False` for CLIP-like models, and can't give loss value in t... | ydshieh | 2022-11-14T16:53:51Z | 2022-11-15T18:47:10Z | 822ae69c1b1c486b6ed277964906e273888221a3 | 0d0d77693f79c7f7d39bba6921cc9741f00de988 | Allow trainer to return eval. loss for CLIP-like models. # What does this PR do?
Allow trainer to give **evaluation** loss for CLIP-like models.
Currently, this line
https://github.com/huggingface/transformers/blob/07d8d6e2f7a920d399e5e86a82d78179cdfa6746/src/transformers/trainer.py#L3192
gives `has_labels = Fa... | ./src/transformers/models/levit/convert_levit_timm_to_pytorch.py | # coding=utf-8
# Copyright 2022 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | # coding=utf-8
# Copyright 2022 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | -1 |
huggingface/transformers | 20,214 | Allow trainer to return eval. loss for CLIP-like models | # What does this PR do?
Allow trainer to give **evaluation** loss for CLIP-like models.
Currently, this line
https://github.com/huggingface/transformers/blob/07d8d6e2f7a920d399e5e86a82d78179cdfa6746/src/transformers/trainer.py#L3192
gives `has_labels = False` for CLIP-like models, and can't give loss value in t... | ydshieh | 2022-11-14T16:53:51Z | 2022-11-15T18:47:10Z | 822ae69c1b1c486b6ed277964906e273888221a3 | 0d0d77693f79c7f7d39bba6921cc9741f00de988 | Allow trainer to return eval. loss for CLIP-like models. # What does this PR do?
Allow trainer to give **evaluation** loss for CLIP-like models.
Currently, this line
https://github.com/huggingface/transformers/blob/07d8d6e2f7a920d399e5e86a82d78179cdfa6746/src/transformers/trainer.py#L3192
gives `has_labels = Fa... | ./tests/models/mobilebert/test_tokenization_mobilebert.py | # coding=utf-8
# Copyright 2022 Leon Derczynski. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by... | # coding=utf-8
# Copyright 2022 Leon Derczynski. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by... | -1 |
huggingface/transformers | 20,214 | Allow trainer to return eval. loss for CLIP-like models | # What does this PR do?
Allow trainer to give **evaluation** loss for CLIP-like models.
Currently, this line
https://github.com/huggingface/transformers/blob/07d8d6e2f7a920d399e5e86a82d78179cdfa6746/src/transformers/trainer.py#L3192
gives `has_labels = False` for CLIP-like models, and can't give loss value in t... | ydshieh | 2022-11-14T16:53:51Z | 2022-11-15T18:47:10Z | 822ae69c1b1c486b6ed277964906e273888221a3 | 0d0d77693f79c7f7d39bba6921cc9741f00de988 | Allow trainer to return eval. loss for CLIP-like models. # What does this PR do?
Allow trainer to give **evaluation** loss for CLIP-like models.
Currently, this line
https://github.com/huggingface/transformers/blob/07d8d6e2f7a920d399e5e86a82d78179cdfa6746/src/transformers/trainer.py#L3192
gives `has_labels = Fa... | ./examples/legacy/seq2seq/test_data/wmt_en_ro/test.source | UN Chief Says There Is No Military Solution in Syria Secretary-General Ban Ki-moon says his response to Russia's stepped up military support for Syria is that "there is no military solution" to the nearly five-year conflict and more weapons will only worsen the violence and misery for millions of people. The U.N. chief... | UN Chief Says There Is No Military Solution in Syria Secretary-General Ban Ki-moon says his response to Russia's stepped up military support for Syria is that "there is no military solution" to the nearly five-year conflict and more weapons will only worsen the violence and misery for millions of people. The U.N. chief... | -1 |
huggingface/transformers | 20,214 | Allow trainer to return eval. loss for CLIP-like models | # What does this PR do?
Allow trainer to give **evaluation** loss for CLIP-like models.
Currently, this line
https://github.com/huggingface/transformers/blob/07d8d6e2f7a920d399e5e86a82d78179cdfa6746/src/transformers/trainer.py#L3192
gives `has_labels = False` for CLIP-like models, and can't give loss value in t... | ydshieh | 2022-11-14T16:53:51Z | 2022-11-15T18:47:10Z | 822ae69c1b1c486b6ed277964906e273888221a3 | 0d0d77693f79c7f7d39bba6921cc9741f00de988 | Allow trainer to return eval. loss for CLIP-like models. # What does this PR do?
Allow trainer to give **evaluation** loss for CLIP-like models.
Currently, this line
https://github.com/huggingface/transformers/blob/07d8d6e2f7a920d399e5e86a82d78179cdfa6746/src/transformers/trainer.py#L3192
gives `has_labels = Fa... | ./src/transformers/models/xlm_roberta/modeling_xlm_roberta.py | # coding=utf-8
# Copyright 2019 Facebook AI Research and the HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the Licens... | # coding=utf-8
# Copyright 2019 Facebook AI Research and the HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the Licens... | -1 |
huggingface/transformers | 20,214 | Allow trainer to return eval. loss for CLIP-like models | # What does this PR do?
Allow trainer to give **evaluation** loss for CLIP-like models.
Currently, this line
https://github.com/huggingface/transformers/blob/07d8d6e2f7a920d399e5e86a82d78179cdfa6746/src/transformers/trainer.py#L3192
gives `has_labels = False` for CLIP-like models, and can't give loss value in t... | ydshieh | 2022-11-14T16:53:51Z | 2022-11-15T18:47:10Z | 822ae69c1b1c486b6ed277964906e273888221a3 | 0d0d77693f79c7f7d39bba6921cc9741f00de988 | Allow trainer to return eval. loss for CLIP-like models. # What does this PR do?
Allow trainer to give **evaluation** loss for CLIP-like models.
Currently, this line
https://github.com/huggingface/transformers/blob/07d8d6e2f7a920d399e5e86a82d78179cdfa6746/src/transformers/trainer.py#L3192
gives `has_labels = Fa... | ./src/transformers/models/ctrl/tokenization_ctrl.py | # coding=utf-8
# Copyright 2018 Salesforce and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless require... | # coding=utf-8
# Copyright 2018 Salesforce and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless require... | -1 |
huggingface/transformers | 20,214 | Allow trainer to return eval. loss for CLIP-like models | # What does this PR do?
Allow trainer to give **evaluation** loss for CLIP-like models.
Currently, this line
https://github.com/huggingface/transformers/blob/07d8d6e2f7a920d399e5e86a82d78179cdfa6746/src/transformers/trainer.py#L3192
gives `has_labels = False` for CLIP-like models, and can't give loss value in t... | ydshieh | 2022-11-14T16:53:51Z | 2022-11-15T18:47:10Z | 822ae69c1b1c486b6ed277964906e273888221a3 | 0d0d77693f79c7f7d39bba6921cc9741f00de988 | Allow trainer to return eval. loss for CLIP-like models. # What does this PR do?
Allow trainer to give **evaluation** loss for CLIP-like models.
Currently, this line
https://github.com/huggingface/transformers/blob/07d8d6e2f7a920d399e5e86a82d78179cdfa6746/src/transformers/trainer.py#L3192
gives `has_labels = Fa... | ./tests/models/mobilebert/__init__.py | -1 | ||
huggingface/transformers | 20,214 | Allow trainer to return eval. loss for CLIP-like models | # What does this PR do?
Allow trainer to give **evaluation** loss for CLIP-like models.
Currently, this line
https://github.com/huggingface/transformers/blob/07d8d6e2f7a920d399e5e86a82d78179cdfa6746/src/transformers/trainer.py#L3192
gives `has_labels = False` for CLIP-like models, and can't give loss value in t... | ydshieh | 2022-11-14T16:53:51Z | 2022-11-15T18:47:10Z | 822ae69c1b1c486b6ed277964906e273888221a3 | 0d0d77693f79c7f7d39bba6921cc9741f00de988 | Allow trainer to return eval. loss for CLIP-like models. # What does this PR do?
Allow trainer to give **evaluation** loss for CLIP-like models.
Currently, this line
https://github.com/huggingface/transformers/blob/07d8d6e2f7a920d399e5e86a82d78179cdfa6746/src/transformers/trainer.py#L3192
gives `has_labels = Fa... | ./src/transformers/models/openai/tokenization_openai_fast.py | # coding=utf-8
# Copyright 2018 The Open AI Team Authors and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# ... | # coding=utf-8
# Copyright 2018 The Open AI Team Authors and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# ... | -1 |
huggingface/transformers | 20,214 | Allow trainer to return eval. loss for CLIP-like models | # What does this PR do?
Allow trainer to give **evaluation** loss for CLIP-like models.
Currently, this line
https://github.com/huggingface/transformers/blob/07d8d6e2f7a920d399e5e86a82d78179cdfa6746/src/transformers/trainer.py#L3192
gives `has_labels = False` for CLIP-like models, and can't give loss value in t... | ydshieh | 2022-11-14T16:53:51Z | 2022-11-15T18:47:10Z | 822ae69c1b1c486b6ed277964906e273888221a3 | 0d0d77693f79c7f7d39bba6921cc9741f00de988 | Allow trainer to return eval. loss for CLIP-like models. # What does this PR do?
Allow trainer to give **evaluation** loss for CLIP-like models.
Currently, this line
https://github.com/huggingface/transformers/blob/07d8d6e2f7a920d399e5e86a82d78179cdfa6746/src/transformers/trainer.py#L3192
gives `has_labels = Fa... | ./docs/source/en/internal/trainer_utils.mdx | <!--Copyright 2020 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | <!--Copyright 2020 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | -1 |
huggingface/transformers | 20,214 | Allow trainer to return eval. loss for CLIP-like models | # What does this PR do?
Allow trainer to give **evaluation** loss for CLIP-like models.
Currently, this line
https://github.com/huggingface/transformers/blob/07d8d6e2f7a920d399e5e86a82d78179cdfa6746/src/transformers/trainer.py#L3192
gives `has_labels = False` for CLIP-like models, and can't give loss value in t... | ydshieh | 2022-11-14T16:53:51Z | 2022-11-15T18:47:10Z | 822ae69c1b1c486b6ed277964906e273888221a3 | 0d0d77693f79c7f7d39bba6921cc9741f00de988 | Allow trainer to return eval. loss for CLIP-like models. # What does this PR do?
Allow trainer to give **evaluation** loss for CLIP-like models.
Currently, this line
https://github.com/huggingface/transformers/blob/07d8d6e2f7a920d399e5e86a82d78179cdfa6746/src/transformers/trainer.py#L3192
gives `has_labels = Fa... | ./src/transformers/models/data2vec/modeling_data2vec_vision.py | # coding=utf-8
# Copyright 2022 Meta Platforms and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICEN... | # coding=utf-8
# Copyright 2022 Meta Platforms and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICEN... | -1 |
huggingface/transformers | 20,214 | Allow trainer to return eval. loss for CLIP-like models | # What does this PR do?
Allow trainer to give **evaluation** loss for CLIP-like models.
Currently, this line
https://github.com/huggingface/transformers/blob/07d8d6e2f7a920d399e5e86a82d78179cdfa6746/src/transformers/trainer.py#L3192
gives `has_labels = False` for CLIP-like models, and can't give loss value in t... | ydshieh | 2022-11-14T16:53:51Z | 2022-11-15T18:47:10Z | 822ae69c1b1c486b6ed277964906e273888221a3 | 0d0d77693f79c7f7d39bba6921cc9741f00de988 | Allow trainer to return eval. loss for CLIP-like models. # What does this PR do?
Allow trainer to give **evaluation** loss for CLIP-like models.
Currently, this line
https://github.com/huggingface/transformers/blob/07d8d6e2f7a920d399e5e86a82d78179cdfa6746/src/transformers/trainer.py#L3192
gives `has_labels = Fa... | ./src/transformers/models/mobilenet_v2/image_processing_mobilenet_v2.py | # coding=utf-8
# Copyright 2022 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | # coding=utf-8
# Copyright 2022 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | -1 |
huggingface/transformers | 20,214 | Allow trainer to return eval. loss for CLIP-like models | # What does this PR do?
Allow trainer to give **evaluation** loss for CLIP-like models.
Currently, this line
https://github.com/huggingface/transformers/blob/07d8d6e2f7a920d399e5e86a82d78179cdfa6746/src/transformers/trainer.py#L3192
gives `has_labels = False` for CLIP-like models, and can't give loss value in t... | ydshieh | 2022-11-14T16:53:51Z | 2022-11-15T18:47:10Z | 822ae69c1b1c486b6ed277964906e273888221a3 | 0d0d77693f79c7f7d39bba6921cc9741f00de988 | Allow trainer to return eval. loss for CLIP-like models. # What does this PR do?
Allow trainer to give **evaluation** loss for CLIP-like models.
Currently, this line
https://github.com/huggingface/transformers/blob/07d8d6e2f7a920d399e5e86a82d78179cdfa6746/src/transformers/trainer.py#L3192
gives `has_labels = Fa... | ./src/transformers/models/bert/convert_bert_pytorch_checkpoint_to_original_tf.py | # coding=utf-8
# Copyright 2018 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | # coding=utf-8
# Copyright 2018 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | -1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./src/transformers/generation/utils.py | # coding=utf-8
# Copyright 2020 The Google AI Language Team Authors, Facebook AI Research authors and The HuggingFace Inc. team.
# Copyright (c) 2020, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the L... | # coding=utf-8
# Copyright 2020 The Google AI Language Team Authors, Facebook AI Research authors and The HuggingFace Inc. team.
# Copyright (c) 2020, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the L... | 1 |
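The cache-layout conversion described in the PR rows above can be sketched with plain Python lists. This is a hedged illustration of the idea, not the actual `transformers` implementation: Bloom packs batch and heads into one leading dimension (conceptually `batch_size * num_heads` entries), while batch-level operations such as contrastive search's candidate selection need an explicit batch axis, so the cache is unpacked before indexing and packed back afterwards:

```python
# Each element of `flat` stands for one (batch, head) cache entry, mirroring
# Bloom's packed (batch_size * num_heads, ...) leading dimension.

def unpack_cache(flat, batch_size):
    # (batch_size * num_heads, ...) -> (batch_size, num_heads, ...)
    num_heads = len(flat) // batch_size
    return [flat[b * num_heads:(b + 1) * num_heads] for b in range(batch_size)]

def pack_cache(nested):
    # (batch_size, num_heads, ...) -> (batch_size * num_heads, ...)
    return [head for batch in nested for head in batch]
```

After unpacking, selecting a subset of the batch is an ordinary index on the outer list, and packing restores the layout the model expects.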
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./src/transformers/models/bloom/modeling_bloom.py | # coding=utf-8
# Copyright 2022 HuggingFace Inc. team and BigScience workshop.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless re... | # coding=utf-8
# Copyright 2022 HuggingFace Inc. team and BigScience workshop.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless re... | 1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./tests/generation/test_utils.py | # coding=utf-8
# Copyright 2020 The HuggingFace Team Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | # coding=utf-8
# Copyright 2020 The HuggingFace Team Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | 1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./src/transformers/models/codegen/__init__.py | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2022 Salesforce authors, The EleutherAI, and HuggingFace Teams. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (... | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2022 Salesforce authors, The EleutherAI, and HuggingFace Teams. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (... | -1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./src/transformers/models/squeezebert/tokenization_squeezebert.py | # coding=utf-8
# Copyright 2020 The SqueezeBert authors and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# U... | # coding=utf-8
# Copyright 2020 The SqueezeBert authors and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# U... | -1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./tests/models/layoutlmv2/test_processor_layoutlmv2.py | # Copyright 2021 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | # Copyright 2021 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | -1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./src/transformers/models/whisper/modeling_whisper.py | # coding=utf-8
# Copyright 2022 The OpenAI Authors and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/L... | # coding=utf-8
# Copyright 2022 The OpenAI Authors and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/L... | -1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./src/transformers/models/decision_transformer/__init__.py | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | -1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./examples/pytorch/semantic-segmentation/run_semantic_segmentation.py | #!/usr/bin/env python
# coding=utf-8
# Copyright 2022 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LI... | #!/usr/bin/env python
# coding=utf-8
# Copyright 2022 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LI... | -1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./src/transformers/models/bert/convert_bert_pytorch_checkpoint_to_original_tf.py | # coding=utf-8
# Copyright 2018 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | # coding=utf-8
# Copyright 2018 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | -1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./src/transformers/models/segformer/__init__.py | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2021 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2021 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | -1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./src/transformers/models/cpm/tokenization_cpm_fast.py | # coding=utf-8
# Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICEN... | # coding=utf-8
# Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICEN... | -1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./examples/pytorch/question-answering/run_qa_beam_search.py | #!/usr/bin/env python
# coding=utf-8
# Copyright 2020 The HuggingFace Team All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-... | #!/usr/bin/env python
# coding=utf-8
# Copyright 2020 The HuggingFace Team All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-... | -1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./src/transformers/models/conditional_detr/__init__.py | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2022 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2022 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | -1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./src/transformers/models/gpt2/modeling_tf_gpt2.py | # coding=utf-8
# Copyright 2018 The OpenAI Team Authors and HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License... | # coding=utf-8
# Copyright 2018 The OpenAI Team Authors and HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License... | -1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./src/transformers/models/wav2vec2_with_lm/__init__.py | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2021 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2021 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | -1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./src/transformers/models/vilt/feature_extraction_vilt.py | # coding=utf-8
# Copyright 2022 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | # coding=utf-8
# Copyright 2022 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | -1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./tests/generation/test_logits_process.py | # coding=utf-8
# Copyright 2020 The HuggingFace Team Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a clone of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | # coding=utf-8
# Copyright 2020 The HuggingFace Team Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a clone of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | -1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./tests/models/swinv2/test_modeling_swinv2.py | # coding=utf-8
# Copyright 2022 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | # coding=utf-8
# Copyright 2022 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | -1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./src/transformers/models/clip/modeling_flax_clip.py | # coding=utf-8
# Copyright 2021 The OpenAI Team Authors, The Google Flax Team Authors and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.... | # coding=utf-8
# Copyright 2021 The OpenAI Team Authors, The Google Flax Team Authors and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.... | -1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./tests/models/cpm/test_tokenization_cpm.py | # coding=utf-8
# Copyright 2018 HuggingFace Inc. team.
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law o... | # coding=utf-8
# Copyright 2018 HuggingFace Inc. team.
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law o... | -1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./src/transformers/modeling_flax_pytorch_utils.py | # coding=utf-8
# Copyright 2021 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | # coding=utf-8
# Copyright 2021 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | -1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./src/transformers/models/wav2vec2_conformer/__init__.py | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2022 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2022 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | -1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./src/transformers/models/esm/openfold_utils/__init__.py | # flake8: noqa
from .chunk_utils import chunk_layer
from .data_transforms import make_atom14_masks
from .feats import atom14_to_atom37, frames_and_literature_positions_to_atom14_pos, torsion_angles_to_frames
from .loss import compute_predicted_aligned_error, compute_tm
from .protein import Protein as OFProtein
from .pr... | # flake8: noqa
from .chunk_utils import chunk_layer
from .data_transforms import make_atom14_masks
from .feats import atom14_to_atom37, frames_and_literature_positions_to_atom14_pos, torsion_angles_to_frames
from .loss import compute_predicted_aligned_error, compute_tm
from .protein import Protein as OFProtein
from .pr... | -1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./examples/research_projects/rag/_test_finetune_rag.py | import json
import logging
import os
import sys
from pathlib import Path
import finetune_rag
from transformers.file_utils import is_apex_available
from transformers.testing_utils import (
TestCasePlus,
execute_subprocess_async,
require_ray,
require_torch_gpu,
require_torch_multi_gpu,
)
logging.ba... | import json
import logging
import os
import sys
from pathlib import Path
import finetune_rag
from transformers.file_utils import is_apex_available
from transformers.testing_utils import (
TestCasePlus,
execute_subprocess_async,
require_ray,
require_torch_gpu,
require_torch_multi_gpu,
)
logging.ba... | -1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./docs/source/it/perf_hardware.mdx | <!---
Copyright 2022 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or ... | <!---
Copyright 2022 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or ... | -1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | Generate: add Bloom fixes for contrastive search. # What does this PR do?
Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails.
This PR adds functionality to... | ./src/transformers/models/canine/convert_canine_original_tf_checkpoint_to_pytorch.py | # coding=utf-8
# Copyright 2021 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | # coding=utf-8
# Copyright 2021 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | -1 |
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | # What does this PR do? Bloom has a different cache format, where the batch size and the number of heads are packed in a single dimension. Contrastive search needs to manipulate the cache at the batch dimension, so naturally it fails. This PR adds functionality to convert Bloom's cache back and forth between its ... | gante | 2022-11-14T15:50:39Z | 2022-11-14T18:34:12Z | fda125638f53febc059cb67f9d7abce058a8f44f | 938cb04789afe44169fba3866bfc1d4a3eacd8ee | ./src/transformers/models/vilt/image_processing_vilt.py | (before/after content: Apache 2.0 license header, Copyright 2022 The HuggingFace Inc. team, truncated) | -1
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | ./examples/research_projects/rag-end2end-retriever/test_run/test_finetune.sh | before/after content (identical, truncated):
# Add parent directory to python path to access lightning_base.py
export PYTHONPATH="../":"${PYTHONPATH}"
#creates the custom knowlegebase
python use_own_knowledge_dataset.py
# Start a single-node Ray cluster.
ray start --head
# A sample finetuning run, you need to specify data_dir, output_dir and model_name_or_pat...
| -1
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | ./tests/models/rembert/__init__.py | (empty before/after content) | -1
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | ./scripts/pegasus/build_test_sample_spm_no_bos.py | before/after content: #!/usr/bin/env python plus Apache 2.0 license header (Copyright 2020 The HuggingFace Team, truncated) | -1
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | ./examples/legacy/seq2seq/rouge_cli.py | before/after content: Apache 2.0 license header (Copyright 2020 The HuggingFace Team, truncated) | -1
huggingface/transformers | 20,213 | Generate: add Bloom fixes for contrastive search | ./src/transformers/models/led/tokenization_led.py | before/after content: Apache 2.0 license header (Copyright 2021 Iz Beltagy, Matthew E. Peters, Arman Cohan and The HuggingFace Inc. team, truncated) | -1
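The PR description above says Bloom packs the batch size and the number of heads into a single cache dimension, while contrastive search needs to index the cache at the batch dimension, so the cache must be converted back and forth. The round trip can be sketched as follows; this is an illustrative, shape-only sketch (the helper names are hypothetical, not the PR's actual code):

```python
# Illustrative sketch of the Bloom cache layout round-trip (hypothetical
# helpers, not the actual transformers API). Bloom fuses batch and heads
# into one leading dimension; contrastive search indexes the batch
# dimension, so the fused layout is unpacked first and re-packed afterwards.

def to_standard_shape(fused_shape, num_heads):
    """[batch * num_heads, ...] -> [batch, num_heads, ...]"""
    fused, *rest = fused_shape
    assert fused % num_heads == 0, "fused dim must be divisible by num_heads"
    return (fused // num_heads, num_heads, *rest)

def to_bloom_shape(standard_shape):
    """[batch, num_heads, ...] -> [batch * num_heads, ...]"""
    batch, num_heads, *rest = standard_shape
    return (batch * num_heads, *rest)

# Round trip on a Bloom-style key cache shape: [batch*heads, head_dim, seq_len]
bloom_key = (2 * 16, 64, 10)                      # batch=2, heads=16
standard = to_standard_shape(bloom_key, num_heads=16)
print(standard)                                    # (2, 16, 64, 10)
print(to_bloom_shape(standard))                    # (32, 64, 10)
```

With the cache in the standard `[batch, num_heads, ...]` layout, slicing or reordering along dimension 0 operates on whole examples, which is exactly the manipulation contrastive search needs before the cache is re-fused.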