repo_name stringclasses 6
values | pr_number int64 99 20.3k | pr_title stringlengths 8 158 | pr_description stringlengths 0 6.54k | author stringlengths 4 18 | date_created timestamp[ns, tz=UTC] | date_merged timestamp[ns, tz=UTC] | previous_commit stringlengths 40 40 | pr_commit stringlengths 40 40 | query stringlengths 37 6.57k | filepath stringlengths 8 153 | before_content stringlengths 0 876M | after_content stringlengths 0 876M | label int64 -1 1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./tests/models/mctct/test_processor_mctct.py | # Copyright 2022 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | # Copyright 2022 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./examples/legacy/run_chinese_ref.py | #!/usr/bin/env python
import argparse
import json
from typing import List
from ltp import LTP
from transformers import BertTokenizer
def _is_chinese_char(cp):
"""Checks whether CP is the codepoint of a CJK character."""
# This defines a "chinese character" as anything in the CJK Unicode block:
# https:... | #!/usr/bin/env python
import argparse
import json
from typing import List
from ltp import LTP
from transformers import BertTokenizer
def _is_chinese_char(cp):
"""Checks whether CP is the codepoint of a CJK character."""
# This defines a "chinese character" as anything in the CJK Unicode block:
# https:... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./src/transformers/models/sew_d/convert_sew_d_original_pytorch_checkpoint_to_pytorch.py | # coding=utf-8
# Copyright 2021 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | # coding=utf-8
# Copyright 2021 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./src/transformers/models/rag/__init__.py | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./src/transformers/pipelines/text2text_generation.py | import enum
import warnings
from ..tokenization_utils import TruncationStrategy
from ..utils import add_end_docstrings, is_tf_available, is_torch_available, logging
from .base import PIPELINE_INIT_ARGS, Pipeline
if is_tf_available():
import tensorflow as tf
from ..models.auto.modeling_tf_auto import TF_MODE... | import enum
import warnings
from ..tokenization_utils import TruncationStrategy
from ..utils import add_end_docstrings, is_tf_available, is_torch_available, logging
from .base import PIPELINE_INIT_ARGS, Pipeline
if is_tf_available():
import tensorflow as tf
from ..models.auto.modeling_tf_auto import TF_MODE... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./tests/models/lilt/__init__.py | -1 | ||
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./utils/test_module/custom_configuration.py | from transformers import PretrainedConfig
class CustomConfig(PretrainedConfig):
model_type = "custom"
def __init__(self, attribute=1, **kwargs):
self.attribute = attribute
super().__init__(**kwargs)
class NoSuperInitConfig(PretrainedConfig):
model_type = "custom"
def __init__(self,... | from transformers import PretrainedConfig
class CustomConfig(PretrainedConfig):
model_type = "custom"
def __init__(self, attribute=1, **kwargs):
self.attribute = attribute
super().__init__(**kwargs)
class NoSuperInitConfig(PretrainedConfig):
model_type = "custom"
def __init__(self,... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./tests/models/wav2vec2/test_feature_extraction_wav2vec2.py | # coding=utf-8
# Copyright 2021 HuggingFace Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or ag... | # coding=utf-8
# Copyright 2021 HuggingFace Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or ag... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./tests/utils/__init__.py | -1 | ||
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./src/transformers/generation_tf_utils.py | # coding=utf-8
# Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a cop... | # coding=utf-8
# Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a cop... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./examples/research_projects/self-training-text-classification/selftraining.py | # coding=utf-8
# Copyright 2022 The Google Research Authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicab... | # coding=utf-8
# Copyright 2022 The Google Research Authors.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicab... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./tests/models/data2vec/test_modeling_data2vec_vision.py | # coding=utf-8
# Copyright 2022 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | # coding=utf-8
# Copyright 2022 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./tests/models/electra/test_modeling_electra.py | # coding=utf-8
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless requir... | # coding=utf-8
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless requir... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./examples/pytorch/language-modeling/run_mlm_no_trainer.py | #!/usr/bin/env python
# coding=utf-8
# Copyright 2021 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LI... | #!/usr/bin/env python
# coding=utf-8
# Copyright 2021 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LI... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./tests/pipelines/test_pipelines_zero_shot_object_detection.py | # Copyright 2021 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | # Copyright 2021 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./src/transformers/models/openai/tokenization_openai_fast.py | # coding=utf-8
# Copyright 2018 The Open AI Team Authors and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# ... | # coding=utf-8
# Copyright 2018 The Open AI Team Authors and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# ... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./examples/research_projects/seq2seq-distillation/utils.py | import itertools
import json
import linecache
import math
import os
import pickle
import socket
from logging import getLogger
from pathlib import Path
from typing import Callable, Dict, Iterable, List, Tuple, Union
import git
import numpy as np
import torch
import torch.distributed as dist
from rouge_score import roug... | import itertools
import json
import linecache
import math
import os
import pickle
import socket
from logging import getLogger
from pathlib import Path
from typing import Callable, Dict, Iterable, List, Tuple, Union
import git
import numpy as np
import torch
import torch.distributed as dist
from rouge_score import roug... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./examples/flax/question-answering/utils_qa.py | # coding=utf-8
# Copyright 2020 The HuggingFace Team All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless require... | # coding=utf-8
# Copyright 2020 The HuggingFace Team All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless require... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./examples/legacy/run_language_modeling.py | #!/usr/bin/env python
# coding=utf-8
# Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
... | #!/usr/bin/env python
# coding=utf-8
# Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./tests/models/flava/test_feature_extraction_flava.py | # coding=utf-8
# Copyright 2022 Meta Platforms authors and HuggingFace Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless requi... | # coding=utf-8
# Copyright 2022 Meta Platforms authors and HuggingFace Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless requi... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./examples/research_projects/deebert/src/modeling_highway_roberta.py | from __future__ import absolute_import, division, print_function, unicode_literals
from torch import nn
from torch.nn import CrossEntropyLoss, MSELoss
from transformers import RobertaConfig
from transformers.file_utils import add_start_docstrings, add_start_docstrings_to_model_forward
from transformers.models.roberta... | from __future__ import absolute_import, division, print_function, unicode_literals
from torch import nn
from torch.nn import CrossEntropyLoss, MSELoss
from transformers import RobertaConfig
from transformers.file_utils import add_start_docstrings, add_start_docstrings_to_model_forward
from transformers.models.roberta... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./src/transformers/pipelines/question_answering.py | import types
import warnings
from collections.abc import Iterable
from typing import TYPE_CHECKING, Dict, List, Optional, Tuple, Union
import numpy as np
from ..data import SquadExample, SquadFeatures, squad_convert_examples_to_features
from ..modelcard import ModelCard
from ..tokenization_utils import PreTrainedToke... | import types
import warnings
from collections.abc import Iterable
from typing import TYPE_CHECKING, Dict, List, Optional, Tuple, Union
import numpy as np
from ..data import SquadExample, SquadFeatures, squad_convert_examples_to_features
from ..modelcard import ModelCard
from ..tokenization_utils import PreTrainedToke... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./examples/research_projects/rag/lightning_base.py | import argparse
import logging
import os
from pathlib import Path
from typing import Any, Dict
import pytorch_lightning as pl
from pytorch_lightning.utilities import rank_zero_info
from transformers import (
AdamW,
AutoConfig,
AutoModel,
AutoModelForPreTraining,
AutoModelForQuestionAnswering,
... | import argparse
import logging
import os
from pathlib import Path
from typing import Any, Dict
import pytorch_lightning as pl
from pytorch_lightning.utilities import rank_zero_info
from transformers import (
AdamW,
AutoConfig,
AutoModel,
AutoModelForPreTraining,
AutoModelForQuestionAnswering,
... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./src/transformers/models/lxmert/modeling_lxmert.py | # coding=utf-8
# Copyright 2018 Hao Tan, Mohit Bansal, and the HuggingFace team
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | # coding=utf-8
# Copyright 2018 Hao Tan, Mohit Bansal, and the HuggingFace team
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./src/transformers/models/swinv2/configuration_swinv2.py | # coding=utf-8
# Copyright 2022 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | # coding=utf-8
# Copyright 2022 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./src/transformers/models/prophetnet/convert_prophetnet_original_pytorch_checkpoint_to_pytorch.py | # coding=utf-8
# Copyright 2020 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | # coding=utf-8
# Copyright 2020 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./docs/source/en/model_doc/dit.mdx | <!--Copyright 2022 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | <!--Copyright 2022 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./tests/models/fnet/test_modeling_fnet.py | # coding=utf-8
# Copyright 2021 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | # coding=utf-8
# Copyright 2021 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./docs/source/en/tasks/asr.mdx | <!--Copyright 2022 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | <!--Copyright 2022 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./docs/source/en/model_doc/xlnet.mdx | <!--Copyright 2020 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | <!--Copyright 2020 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./docs/source/en/tasks/question_answering.mdx | <!--Copyright 2022 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | <!--Copyright 2022 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./src/transformers/models/convnext/modeling_tf_convnext.py | # coding=utf-8
# Copyright 2022 Meta Platforms Inc. and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/... | # coding=utf-8
# Copyright 2022 Meta Platforms Inc. and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./examples/legacy/run_openai_gpt.py | #!/usr/bin/env python
# coding=utf-8
# Copyright 2018 Google AI, Google Brain and Carnegie Mellon University Authors and the HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in co... | #!/usr/bin/env python
# coding=utf-8
# Copyright 2018 Google AI, Google Brain and Carnegie Mellon University Authors and the HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in co... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./tests/models/segformer/test_feature_extraction_segformer.py | # coding=utf-8
# Copyright 2021 HuggingFace Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or ag... | # coding=utf-8
# Copyright 2021 HuggingFace Inc.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or ag... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./docs/source/en/model_doc/t5.mdx | <!--Copyright 2020 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | <!--Copyright 2020 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./examples/research_projects/rag/finetune_rag_ray.sh | # Sample script to finetune RAG using Ray for distributed retrieval.
# Add parent directory to python path to access lightning_base.py
export PYTHONPATH="../":"${PYTHONPATH}"
# Start a single-node Ray cluster.
ray start --head
# A sample finetuning run, you need to specify data_dir, output_dir and model_name_or_path... | # Sample script to finetune RAG using Ray for distributed retrieval.
# Add parent directory to python path to access lightning_base.py
export PYTHONPATH="../":"${PYTHONPATH}"
# Start a single-node Ray cluster.
ray start --head
# A sample finetuning run, you need to specify data_dir, output_dir and model_name_or_path... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./docs/source/en/model_doc/ctrl.mdx | <!--Copyright 2020 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | <!--Copyright 2020 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./examples/legacy/seq2seq/test_data/wmt_en_ro/val.len | ]q (KKKKKKKKKKgKKKKKKe. | ]q (KKKKKKKKKKgKKKKKKe. | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./.git/objects/1f/434df347b620d89cca7cf5f51b94060f78d9da | x[[wFg<(NUb9#Ht29&4@&$v6/@wWW
4=_ŦLJx)śzH(b,U(T$%JdHMb)9u6b1#},Ugb?]% lZJJD<Iw,*d"WEDM%oc
O#/6"DT1UUކ#b7GQoO>>8 a^>K\'%:݈ >q4it+RDRb 2@PJ $gdZW[28?2]@|utqrOoϏN'ݹ髓ɻS<G? R' !uB
'$ƹE-XkYf8(dJ1SJUREi$Av2`ļW$R2;}{!D) ]OCp|7Qfi;]%M3E\ʨҧ7og:M)fB)SAʦ%i)h%gbˁg.bO... | x[[wFg<(NUb9#Ht29&4@&$v6/@wWW
4=_ŦLJx)śzH(b,U(T$%JdHMb)9u6b1#},Ugb?]% lZJJD<Iw,*d"WEDM%oc
O#/6"DT1UUކ#b7GQoO>>8 a^>K\'%:݈ >q4it+RDRb 2@PJ $gdZW[28?2]@|utqrOoϏN'ݹ髓ɻS<G? R' !uB
'$ƹE-XkYf8(dJ1SJUREi$Av2`ļW$R2;}{!D) ]OCp|7Qfi;]%M3E\ʨҧ7og:M)fB)SAʦ%i)h%gbˁg.bO... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./docs/source/en/model_doc/van.mdx | <!--Copyright 2022 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | <!--Copyright 2022 The HuggingFace Team. All rights reserved.
Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with
the License. You may obtain a copy of the License at
http://www.apache.org/licenses/LICENSE-2.0
Unless required by applicable law or agreed... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./tests/fixtures/tests_samples/GermEval/labels.txt | B-LOC
B-LOCderiv
B-LOCpart
B-ORG
B-ORGderiv
B-ORGpart
B-OTH
B-OTHderiv
B-OTHpart
B-PER
B-PERderiv
B-PERpart
I-LOC
I-LOCderiv
I-LOCpart
I-ORG
I-ORGderiv
I-ORGpart
I-OTH
I-OTHderiv
I-OTHpart
I-PER
I-PERderiv
I-PERpart
O
| B-LOC
B-LOCderiv
B-LOCpart
B-ORG
B-ORGderiv
B-ORGpart
B-OTH
B-OTHderiv
B-OTHpart
B-PER
B-PERderiv
B-PERpart
I-LOC
I-LOCderiv
I-LOCpart
I-ORG
I-ORGderiv
I-ORGpart
I-OTH
I-OTHderiv
I-OTHpart
I-PER
I-PERderiv
I-PERpart
O
| -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./src/transformers/models/openai/__init__.py | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./tests/utils/test_cli.py | # coding=utf-8
# Copyright 2019-present, the HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by a... | # coding=utf-8
# Copyright 2019-present, the HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by a... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./src/transformers/models/led/tokenization_led_fast.py | # coding=utf-8
# Copyright 2021 Iz Beltagy, Matthew E. Peters, Arman Cohan and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://w... | # coding=utf-8
# Copyright 2021 Iz Beltagy, Matthew E. Peters, Arman Cohan and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://w... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./docker/transformers-pytorch-cpu/Dockerfile | FROM ubuntu:18.04
LABEL maintainer="Hugging Face"
LABEL repository="transformers"
RUN apt update && \
apt install -y bash \
build-essential \
git \
curl \
ca-certificates \
python3 \
python3-pip && \
... | FROM ubuntu:18.04
LABEL maintainer="Hugging Face"
LABEL repository="transformers"
RUN apt update && \
apt install -y bash \
build-essential \
git \
curl \
ca-certificates \
python3 \
python3-pip && \
... | -1 |
huggingface/transformers | 20,325 | Add LayerScale to NAT/DiNAT |
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] This PR fixes a typo or improv... | alihassanijr | 2022-11-18T22:01:53Z | 2022-11-21T14:08:35Z | d28448c5cd8fa8dfb64190c7f55275d80e256a9e | 11f3ec7224c83c9e5c379a774b9d3984e68d26fa | Add LayerScale to NAT/DiNAT.
# What does this PR do?
This follows PR #20219 .
I completely dropped the ball on LayerScale in the original PR.
This is just an optional argument in both models, and is only activated for larger variants in order to provide training stability.
## Before submitting
- [ ] T... | ./examples/research_projects/lxmert/visualizing_image.py | """
coding=utf-8
Copyright 2018, Antonio Mendoza Hao Tan, Mohit Bansal
Adapted From Facebook Inc, Detectron2
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/license... | """
coding=utf-8
Copyright 2018, Antonio Mendoza Hao Tan, Mohit Bansal
Adapted From Facebook Inc, Detectron2
Licensed under the Apache License, Version 2.0 (the "License");
you may not use this file except in compliance with the License.
You may obtain a copy of the License at
http://www.apache.org/license... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/pipelines/audio_classification.py | # Copyright 2021 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | # Copyright 2021 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | 1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/pipelines/automatic_speech_recognition.py | # Copyright 2021 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | # Copyright 2021 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | 1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/pipelines/conversational.py | import uuid
from typing import Any, Dict, List, Optional, Union
from ..utils import add_end_docstrings, is_tf_available, is_torch_available, logging
from .base import PIPELINE_INIT_ARGS, Pipeline
if is_tf_available():
import tensorflow as tf
if is_torch_available():
import torch
logger = logging.get_logge... | import uuid
from typing import Any, Dict, List, Optional, Union
from ..utils import add_end_docstrings, is_tf_available, is_torch_available, logging
from .base import PIPELINE_INIT_ARGS, Pipeline
if is_tf_available():
import tensorflow as tf
if is_torch_available():
import torch
logger = logging.get_logge... | 1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/pipelines/depth_estimation.py | from typing import List, Union
import numpy as np
from ..utils import add_end_docstrings, is_torch_available, is_vision_available, logging, requires_backends
from .base import PIPELINE_INIT_ARGS, Pipeline
if is_vision_available():
from PIL import Image
from ..image_utils import load_image
if is_torch_avai... | from typing import List, Union
import numpy as np
from ..utils import add_end_docstrings, is_torch_available, is_vision_available, logging, requires_backends
from .base import PIPELINE_INIT_ARGS, Pipeline
if is_vision_available():
from PIL import Image
from ..image_utils import load_image
if is_torch_avai... | 1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/pipelines/document_question_answering.py | # Copyright 2022 The Impira Team and the HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | # Copyright 2022 The Impira Team and the HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | 1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/pipelines/feature_extraction.py | from typing import Dict
from .base import GenericTensor, Pipeline
# Can't use @add_end_docstrings(PIPELINE_INIT_ARGS) here because this one does not accept `binary_output`
class FeatureExtractionPipeline(Pipeline):
"""
Feature extraction pipeline using no model head. This pipeline extracts the hidden states ... | from typing import Dict
from .base import GenericTensor, Pipeline
# Can't use @add_end_docstrings(PIPELINE_INIT_ARGS) here because this one does not accept `binary_output`
class FeatureExtractionPipeline(Pipeline):
"""
Feature extraction pipeline using no model head. This pipeline extracts the hidden states ... | 1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/pipelines/fill_mask.py | from typing import Dict
import numpy as np
from ..utils import add_end_docstrings, is_tf_available, is_torch_available, logging
from .base import PIPELINE_INIT_ARGS, GenericTensor, Pipeline, PipelineException
if is_tf_available():
import tensorflow as tf
from ..tf_utils import stable_softmax
if is_torch_... | from typing import Dict
import numpy as np
from ..utils import add_end_docstrings, is_tf_available, is_torch_available, logging
from .base import PIPELINE_INIT_ARGS, GenericTensor, Pipeline, PipelineException
if is_tf_available():
import tensorflow as tf
from ..tf_utils import stable_softmax
if is_torch_... | 1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/pipelines/image_classification.py | from typing import List, Union
from ..utils import (
add_end_docstrings,
is_tf_available,
is_torch_available,
is_vision_available,
logging,
requires_backends,
)
from .base import PIPELINE_INIT_ARGS, Pipeline
if is_vision_available():
from PIL import Image
from ..image_utils import lo... | from typing import List, Union
from ..utils import (
add_end_docstrings,
is_tf_available,
is_torch_available,
is_vision_available,
logging,
requires_backends,
)
from .base import PIPELINE_INIT_ARGS, Pipeline
if is_vision_available():
from PIL import Image
from ..image_utils import lo... | 1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/pipelines/image_to_text.py | from typing import List, Union
from ..utils import (
add_end_docstrings,
is_tf_available,
is_torch_available,
is_vision_available,
logging,
requires_backends,
)
from .base import PIPELINE_INIT_ARGS, Pipeline
if is_vision_available():
from PIL import Image
from ..image_utils import lo... | from typing import List, Union
from ..utils import (
add_end_docstrings,
is_tf_available,
is_torch_available,
is_vision_available,
logging,
requires_backends,
)
from .base import PIPELINE_INIT_ARGS, Pipeline
if is_vision_available():
from PIL import Image
from ..image_utils import lo... | 1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/pipelines/object_detection.py | from typing import Any, Dict, List, Union
from ..utils import add_end_docstrings, is_torch_available, is_vision_available, logging, requires_backends
from .base import PIPELINE_INIT_ARGS, Pipeline
if is_vision_available():
from ..image_utils import load_image
if is_torch_available():
import torch
from... | from typing import Any, Dict, List, Union
from ..utils import add_end_docstrings, is_torch_available, is_vision_available, logging, requires_backends
from .base import PIPELINE_INIT_ARGS, Pipeline
if is_vision_available():
from ..image_utils import load_image
if is_torch_available():
import torch
from... | 1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/pipelines/question_answering.py | import types
import warnings
from collections.abc import Iterable
from typing import TYPE_CHECKING, Dict, List, Optional, Tuple, Union
import numpy as np
from ..data import SquadExample, SquadFeatures, squad_convert_examples_to_features
from ..modelcard import ModelCard
from ..tokenization_utils import PreTrainedToke... | import types
import warnings
from collections.abc import Iterable
from typing import TYPE_CHECKING, Dict, List, Optional, Tuple, Union
import numpy as np
from ..data import SquadExample, SquadFeatures, squad_convert_examples_to_features
from ..modelcard import ModelCard
from ..tokenization_utils import PreTrainedToke... | 1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/pipelines/table_question_answering.py | import collections
import types
import numpy as np
from ..utils import (
add_end_docstrings,
is_tensorflow_probability_available,
is_tf_available,
is_torch_available,
requires_backends,
)
from .base import PIPELINE_INIT_ARGS, ArgumentHandler, Dataset, Pipeline, PipelineException
if is_torch_avai... | import collections
import types
import numpy as np
from ..utils import (
add_end_docstrings,
is_tensorflow_probability_available,
is_tf_available,
is_torch_available,
requires_backends,
)
from .base import PIPELINE_INIT_ARGS, ArgumentHandler, Dataset, Pipeline, PipelineException
if is_torch_avai... | 1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/pipelines/text2text_generation.py | import enum
import warnings
from ..tokenization_utils import TruncationStrategy
from ..utils import add_end_docstrings, is_tf_available, is_torch_available, logging
from .base import PIPELINE_INIT_ARGS, Pipeline
if is_tf_available():
import tensorflow as tf
from ..models.auto.modeling_tf_auto import TF_MODE... | import enum
import warnings
from ..tokenization_utils import TruncationStrategy
from ..utils import add_end_docstrings, is_tf_available, is_torch_available, logging
from .base import PIPELINE_INIT_ARGS, Pipeline
if is_tf_available():
import tensorflow as tf
from ..models.auto.modeling_tf_auto import TF_MODE... | 1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/pipelines/text_classification.py | import warnings
from typing import Dict
import numpy as np
from ..utils import ExplicitEnum, add_end_docstrings, is_tf_available, is_torch_available
from .base import PIPELINE_INIT_ARGS, GenericTensor, Pipeline
if is_tf_available():
from ..models.auto.modeling_tf_auto import TF_MODEL_FOR_SEQUENCE_CLASSIFICATION... | import warnings
from typing import Dict
import numpy as np
from ..utils import ExplicitEnum, add_end_docstrings, is_tf_available, is_torch_available
from .base import PIPELINE_INIT_ARGS, GenericTensor, Pipeline
if is_tf_available():
from ..models.auto.modeling_tf_auto import TF_MODEL_FOR_SEQUENCE_CLASSIFICATION... | 1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/pipelines/text_generation.py | import enum
import warnings
from transformers import MODEL_FOR_CAUSAL_LM_MAPPING, TF_MODEL_FOR_CAUSAL_LM_MAPPING
from ..utils import add_end_docstrings, is_tf_available
from .base import PIPELINE_INIT_ARGS, Pipeline
if is_tf_available():
import tensorflow as tf
class ReturnType(enum.Enum):
TENSORS = 0
... | import enum
import warnings
from transformers import MODEL_FOR_CAUSAL_LM_MAPPING, TF_MODEL_FOR_CAUSAL_LM_MAPPING
from ..utils import add_end_docstrings, is_tf_available
from .base import PIPELINE_INIT_ARGS, Pipeline
if is_tf_available():
import tensorflow as tf
class ReturnType(enum.Enum):
TENSORS = 0
... | 1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/pipelines/token_classification.py | import types
import warnings
from typing import List, Optional, Tuple, Union
import numpy as np
from ..models.bert.tokenization_bert import BasicTokenizer
from ..utils import ExplicitEnum, add_end_docstrings, is_tf_available, is_torch_available
from .base import PIPELINE_INIT_ARGS, ArgumentHandler, Dataset, Pipeline
... | import types
import warnings
from typing import List, Optional, Tuple, Union
import numpy as np
from ..models.bert.tokenization_bert import BasicTokenizer
from ..utils import ExplicitEnum, add_end_docstrings, is_tf_available, is_torch_available
from .base import PIPELINE_INIT_ARGS, ArgumentHandler, Dataset, Pipeline
... | 1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/pipelines/visual_question_answering.py | from typing import Union
from ..utils import add_end_docstrings, is_torch_available, is_vision_available, logging
from .base import PIPELINE_INIT_ARGS, Pipeline
if is_vision_available():
from PIL import Image
from ..image_utils import load_image
if is_torch_available():
from ..models.auto.modeling_auto... | from typing import Union
from ..utils import add_end_docstrings, is_torch_available, is_vision_available, logging
from .base import PIPELINE_INIT_ARGS, Pipeline
if is_vision_available():
from PIL import Image
from ..image_utils import load_image
if is_torch_available():
from ..models.auto.modeling_auto... | 1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/pipelines/zero_shot_classification.py | from typing import List, Union
import numpy as np
from ..tokenization_utils import TruncationStrategy
from ..utils import add_end_docstrings, logging
from .base import PIPELINE_INIT_ARGS, ArgumentHandler, ChunkPipeline
logger = logging.get_logger(__name__)
class ZeroShotClassificationArgumentHandler(ArgumentHandl... | from typing import List, Union
import numpy as np
from ..tokenization_utils import TruncationStrategy
from ..utils import add_end_docstrings, logging
from .base import PIPELINE_INIT_ARGS, ArgumentHandler, ChunkPipeline
logger = logging.get_logger(__name__)
class ZeroShotClassificationArgumentHandler(ArgumentHandl... | 1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/pipelines/zero_shot_image_classification.py | from typing import List, Union
from ..utils import (
add_end_docstrings,
is_tf_available,
is_torch_available,
is_vision_available,
logging,
requires_backends,
)
from .base import PIPELINE_INIT_ARGS, ChunkPipeline
if is_vision_available():
from PIL import Image
from ..image_utils impo... | from typing import List, Union
from ..utils import (
add_end_docstrings,
is_tf_available,
is_torch_available,
is_vision_available,
logging,
requires_backends,
)
from .base import PIPELINE_INIT_ARGS, ChunkPipeline
if is_vision_available():
from PIL import Image
from ..image_utils impo... | 1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/pipelines/zero_shot_object_detection.py | from typing import Any, Dict, List, Union
from ..utils import add_end_docstrings, is_torch_available, is_vision_available, logging, requires_backends
from .base import PIPELINE_INIT_ARGS, ChunkPipeline
if is_vision_available():
from PIL import Image
from ..image_utils import load_image
if is_torch_availabl... | from typing import Any, Dict, List, Union
from ..utils import add_end_docstrings, is_torch_available, is_vision_available, logging, requires_backends
from .base import PIPELINE_INIT_ARGS, ChunkPipeline
if is_vision_available():
from PIL import Image
from ..image_utils import load_image
if is_torch_availabl... | 1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/models/auto/feature_extraction_auto.py | # coding=utf-8
# Copyright 2021 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | # coding=utf-8
# Copyright 2021 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/models/bloom/__init__.py | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2022 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2022 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | # coding=utf-8
# Copyright 2021 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | # coding=utf-8
# Copyright 2021 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./examples/research_projects/bertabs/utils_summarization.py | import os
from collections import deque
import torch
from torch.utils.data import Dataset
# ------------
# Data loading
# ------------
class CNNDMDataset(Dataset):
"""Abstracts the dataset used to train seq2seq models.
The class will process the documents that are located in the specified
folder. The ... | import os
from collections import deque
import torch
from torch.utils.data import Dataset
# ------------
# Data loading
# ------------
class CNNDMDataset(Dataset):
"""Abstracts the dataset used to train seq2seq models.
The class will process the documents that are located in the specified
folder. The ... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/models/wav2vec2/modeling_wav2vec2.py | # coding=utf-8
# Copyright 2021 The Fairseq Authors and the HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/... | # coding=utf-8
# Copyright 2021 The Fairseq Authors and the HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/models/xlm/modeling_xlm.py | # coding=utf-8
# Copyright 2019-present, Facebook, Inc and the HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Un... | # coding=utf-8
# Copyright 2019-present, Facebook, Inc and the HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Un... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/models/bert/convert_bert_pytorch_checkpoint_to_original_tf.py | # coding=utf-8
# Copyright 2018 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | # coding=utf-8
# Copyright 2018 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/models/deformable_detr/convert_deformable_detr_to_pytorch.py | # coding=utf-8
# Copyright 2022 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | # coding=utf-8
# Copyright 2022 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./tests/models/ibert/test_modeling_ibert.py | # coding=utf-8
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless requir... | # coding=utf-8
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless requir... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/models/clip/tokenization_clip_fast.py | # coding=utf-8
# Copyright 2021 The Open AI Team Authors and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# ... | # coding=utf-8
# Copyright 2021 The Open AI Team Authors and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# ... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/models/bert/modeling_flax_bert.py | # coding=utf-8
# Copyright 2021 The Google Flax Team Authors and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
... | # coding=utf-8
# Copyright 2021 The Google Flax Team Authors and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/models/mbart/modeling_mbart.py | # coding=utf-8
# Copyright 2021, The Facebook AI Research Team and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.or... | # coding=utf-8
# Copyright 2021, The Facebook AI Research Team and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.or... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/models/lilt/__init__.py | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2022 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2022 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./examples/research_projects/performer/run_mlm_performer.py | # coding=utf-8
# Copyright 2020 The HuggingFace Team All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless require... | # coding=utf-8
# Copyright 2020 The HuggingFace Team All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless require... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/models/regnet/configuration_regnet.py | # coding=utf-8
# Copyright 2022 Meta Platforms, Inc. and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses... | # coding=utf-8
# Copyright 2022 Meta Platforms, Inc. and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./examples/pytorch/contrastive-image-text/run_clip.py | #!/usr/bin/env python
# coding=utf-8
# Copyright 2022 The HuggingFace Team All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-... | #!/usr/bin/env python
# coding=utf-8
# Copyright 2022 The HuggingFace Team All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/utils/sentencepiece_model_pb2.py | # flake8: noqa
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: sentencepiece_model.proto
# Copyright 2022 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obta... | # flake8: noqa
# Generated by the protocol buffer compiler. DO NOT EDIT!
# source: sentencepiece_model.proto
# Copyright 2022 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obta... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./tests/models/megatron_bert/__init__.py | -1 | ||
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./tests/models/barthez/test_tokenization_barthez.py | # coding=utf-8
# Copyright 2020 Ecole Polytechnique and HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless re... | # coding=utf-8
# Copyright 2020 Ecole Polytechnique and HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless re... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./tests/models/clip/test_modeling_clip.py | # coding=utf-8
# Copyright 2021 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | # coding=utf-8
# Copyright 2021 The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless r... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./tests/models/speech_to_text/test_processor_speech_to_text.py | # Copyright 2021 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | # Copyright 2021 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicabl... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./tests/models/phobert/test_tokenization_phobert.py | # coding=utf-8
# Copyright 2018 Salesforce and HuggingFace Inc. team.
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by a... | # coding=utf-8
# Copyright 2018 Salesforce and HuggingFace Inc. team.
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by a... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/models/speech_to_text_2/__init__.py | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2021 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2021 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/models/sew/__init__.py | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2021 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | # flake8: noqa
# There's no way to ignore "F401 '...' imported but unused" warnings in this
# module, but to preserve other warnings. So, don't check this module at all.
# Copyright 2021 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use thi... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/models/sew_d/configuration_sew_d.py | # coding=utf-8
# Copyright 2021 ASAPP Inc. and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2... | # coding=utf-8
# Copyright 2021 ASAPP Inc. and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/models/openai/configuration_openai.py | # coding=utf-8
# Copyright 2018 The OpenAI Team Authors and HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License... | # coding=utf-8
# Copyright 2018 The OpenAI Team Authors and HuggingFace Inc. team.
# Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./tests/models/mvp/test_modeling_mvp.py | # coding=utf-8
# Copyright 2021, The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless ... | # coding=utf-8
# Copyright 2021, The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless ... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./utils/past_ci_versions.py | import argparse
import os
past_versions_testing = {
"pytorch": {
"1.11": {
"torch": "1.11.0",
"torchvision": "0.12.0",
"torchaudio": "0.11.0",
"python": 3.9,
"cuda": "cu113",
"install": (
"python3 -m pip install --no-c... | import argparse
import os
past_versions_testing = {
"pytorch": {
"1.11": {
"torch": "1.11.0",
"torchvision": "0.12.0",
"torchaudio": "0.11.0",
"python": 3.9,
"cuda": "cu113",
"install": (
"python3 -m pip install --no-c... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./tests/models/longformer/test_modeling_tf_longformer.py | # coding=utf-8
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless requir... | # coding=utf-8
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless requir... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | # coding=utf-8
# Copyright 2021 The Facebook, Inc. and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/L... | # coding=utf-8
# Copyright 2021 The Facebook, Inc. and The HuggingFace Inc. team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/L... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./tests/models/camembert/test_modeling_camembert.py | # coding=utf-8
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless requir... | # coding=utf-8
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless requir... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/models/tapas/configuration_tapas.py | # coding=utf-8
# Copyright 2020 Google Research and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless re... | # coding=utf-8
# Copyright 2020 Google Research and The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless re... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./src/transformers/models/retribert/tokenization_retribert.py | # coding=utf-8
# Copyright 2018 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | # coding=utf-8
# Copyright 2018 The HuggingFace Inc. team.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable... | -1 |
huggingface/transformers | 20,307 | Remove double brackets | Fixes a small typo in the pipeline docs where there were two brackets. | stevhliu | 2022-11-17T19:40:39Z | 2022-11-18T17:29:24Z | f10cdba22e1a91a8f0774b75de3d2a3826ecb8cc | b2c863a3196150850d17548f25ee0575bccb8224 | Remove double brackets. Fixes a small typo in the pipeline docs where there were two brackets. | ./scripts/fsmt/fsmt-make-tiny-model.py | #!/usr/bin/env python
# coding: utf-8
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENS... | #!/usr/bin/env python
# coding: utf-8
# Copyright 2020 The HuggingFace Team. All rights reserved.
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENS... | -1 |