Dataset schema (column, type, preview statistics):

repo_name        stringclasses          6 values
pr_number        int64                  min 99, max 20.3k
pr_title         stringlengths          min 8, max 158
pr_description   stringlengths          min 0, max 6.54k
author           stringlengths          min 4, max 18
date_created     timestamp[ns, tz=UTC]
date_merged      timestamp[ns, tz=UTC]
previous_commit  stringlengths          min 40, max 40 (git commit SHA)
pr_commit        stringlengths          min 40, max 40 (git commit SHA)
query            stringlengths          min 37, max 6.57k
filepath         stringlengths          min 8, max 153
before_content   stringlengths          min 0, max 876M
after_content    stringlengths          min 0, max 876M
label            int64                  min -1, max 1
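The schema above can be mirrored as a plain Python structure for downstream processing. A minimal sketch: the dataclass name `PRFileRow` is illustrative (not part of the dataset), the sample values are copied from the preview rows below, and the `...` strings stand in for text truncated in the preview.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

# Illustrative mirror of the dataset columns; field names match the schema.
@dataclass
class PRFileRow:
    repo_name: str
    pr_number: int
    pr_title: str
    pr_description: str
    author: str
    date_created: datetime
    date_merged: datetime
    previous_commit: str   # 40-character git commit SHA
    pr_commit: str         # 40-character git commit SHA
    query: str
    filepath: str
    before_content: str
    after_content: str
    label: int             # -1 to 1 in the preview

# One row built from the preview (truncated strings abbreviated with "...").
row = PRFileRow(
    repo_name="huggingface/transformers",
    pr_number=20209,
    pr_title="Add gpt-sw3 model to transformers",
    pr_description="This adds the gpt-sw3 models and tokenizer to hf. ...",
    author="ekgren",
    date_created=datetime(2022, 11, 14, 14, 4, 0, tzinfo=timezone.utc),
    date_merged=datetime(2022, 12, 12, 18, 12, 13, tzinfo=timezone.utc),
    previous_commit="b58beebe7286bf53a80f137e0e5cd100ccb77ae2",
    pr_commit="5f94855dc31242d15d755b0d97ec6a0479ee0ea9",
    query="Add gpt-sw3 model to transformers. ...",
    filepath="./tests/models/electra/test_modeling_electra.py",
    before_content="",
    after_content="",
    label=-1,
)

# Both commit fields are full-length SHAs, as the schema's fixed length 40 implies.
assert len(row.previous_commit) == 40 and len(row.pr_commit) == 40
```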
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./tests/models/electra/test_modeling_electra.py
# coding=utf-8 # Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless requir...
# coding=utf-8 # Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless requir...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./docs/source/en/main_classes/data_collator.mdx
<!--Copyright 2020 The HuggingFace Team. All rights reserved. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed...
<!--Copyright 2020 The HuggingFace Team. All rights reserved. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./src/transformers/models/mobilevit/feature_extraction_mobilevit.py
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./src/transformers/models/codegen/tokenization_codegen_fast.py
# coding=utf-8 # Copyright 2022 The Salesforce authors, The Open AI Team Authors and The HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/l...
# coding=utf-8 # Copyright 2022 The Salesforce authors, The Open AI Team Authors and The HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/l...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./examples/research_projects/rag-end2end-retriever/callbacks_rag.py
import logging from pathlib import Path import numpy as np import pytorch_lightning as pl import torch from pytorch_lightning.callbacks import EarlyStopping, ModelCheckpoint from pytorch_lightning.utilities import rank_zero_only from utils_rag import save_json def count_trainable_parameters(model): model_parame...
import logging from pathlib import Path import numpy as np import pytorch_lightning as pl import torch from pytorch_lightning.callbacks import EarlyStopping, ModelCheckpoint from pytorch_lightning.utilities import rank_zero_only from utils_rag import save_json def count_trainable_parameters(model): model_parame...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./docs/source/en/model_doc/segformer.mdx
<!--Copyright 2021 The HuggingFace Team. All rights reserved. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed...
<!--Copyright 2021 The HuggingFace Team. All rights reserved. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./MANIFEST.in
include LICENSE
include LICENSE
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./examples/research_projects/seq2seq-distillation/distil_marian_no_teacher.sh
#!/usr/bin/env bash export PYTHONPATH="../":"${PYTHONPATH}" export WANDB_PROJECT=dmar export MAX_LEN=128 python finetune.py \ --learning_rate=3e-4 \ --do_train \ --do_predict \ --fp16 \ --val_check_interval 0.25 \ --data_dir $ENRO_DIR \ --max_source_length $MAX_LEN --max_target_length $MAX_LEN --val_max_t...
#!/usr/bin/env bash export PYTHONPATH="../":"${PYTHONPATH}" export WANDB_PROJECT=dmar export MAX_LEN=128 python finetune.py \ --learning_rate=3e-4 \ --do_train \ --do_predict \ --fp16 \ --val_check_interval 0.25 \ --data_dir $ENRO_DIR \ --max_source_length $MAX_LEN --max_target_length $MAX_LEN --val_max_t...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./tests/models/convbert/test_modeling_convbert.py
# coding=utf-8 # Copyright 2021 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
# coding=utf-8 # Copyright 2021 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./src/transformers/models/marian/tokenization_marian.py
# Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
# Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./tests/models/realm/test_tokenization_realm.py
# coding=utf-8 # Copyright 2022 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless requir...
# coding=utf-8 # Copyright 2022 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless requir...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./tests/models/rembert/test_modeling_tf_rembert.py
# coding=utf-8 # Copyright 2021 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
# coding=utf-8 # Copyright 2021 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./tests/models/squeezebert/__init__.py
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./tests/pipelines/test_pipelines_text_generation.py
# Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
# Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./tests/pipelines/test_pipelines_translation.py
# Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
# Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./src/transformers/models/fnet/tokenization_fnet_fast.py
# coding=utf-8 # Copyright 2021 Google AI, Google Brain and the HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # U...
# coding=utf-8 # Copyright 2021 Google AI, Google Brain and the HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # U...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./tests/models/levit/__init__.py
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./examples/research_projects/seq2seq-distillation/lightning_base.py
import argparse import logging import os from pathlib import Path from typing import Any, Dict import pytorch_lightning as pl from pytorch_lightning.utilities import rank_zero_info from transformers import ( AdamW, AutoConfig, AutoModel, AutoModelForPreTraining, AutoModelForQuestionAnswering, ...
import argparse import logging import os from pathlib import Path from typing import Any, Dict import pytorch_lightning as pl from pytorch_lightning.utilities import rank_zero_info from transformers import ( AdamW, AutoConfig, AutoModel, AutoModelForPreTraining, AutoModelForQuestionAnswering, ...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./tests/deepspeed/ds_config_zero3.json
{ "fp16": { "enabled": "auto", "loss_scale": 0, "loss_scale_window": 1000, "initial_scale_power": 16, "hysteresis": 2, "min_loss_scale": 1 }, "bf16": { "enabled": "auto" }, "optimizer": { "type": "AdamW", "params": { ...
{ "fp16": { "enabled": "auto", "loss_scale": 0, "loss_scale_window": 1000, "initial_scale_power": 16, "hysteresis": 2, "min_loss_scale": 1 }, "bf16": { "enabled": "auto" }, "optimizer": { "type": "AdamW", "params": { ...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./examples/pytorch/audio-classification/run_audio_classification.py
#!/usr/bin/env python # coding=utf-8 # Copyright 2021 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LI...
#!/usr/bin/env python # coding=utf-8 # Copyright 2021 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LI...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./src/transformers/models/canine/configuration_canine.py
# coding=utf-8 # Copyright Google AI and The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # #...
# coding=utf-8 # Copyright Google AI and The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # #...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./docs/source/en/model_doc/convbert.mdx
<!--Copyright 2020 The HuggingFace Team. All rights reserved. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed...
<!--Copyright 2020 The HuggingFace Team. All rights reserved. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed...
-1
./src/transformers/models/xlm_roberta/configuration_xlm_roberta.py
# coding=utf-8 # Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team. # Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a cop...
# coding=utf-8 # Copyright 2018 The Google AI Language Team Authors and The HuggingFace Inc. team. # Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a cop...
-1
./src/transformers/models/rag/configuration_rag.py
# coding=utf-8 # Copyright 2020, The RAG Authors and The HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
# coding=utf-8 # Copyright 2020, The RAG Authors and The HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
-1
./examples/legacy/token-classification/tasks.py
import logging import os from typing import List, TextIO, Union from conllu import parse_incr from utils_ner import InputExample, Split, TokenClassificationTask logger = logging.getLogger(__name__) class NER(TokenClassificationTask): def __init__(self, label_idx=-1): # in NER datasets, the last column...
import logging import os from typing import List, TextIO, Union from conllu import parse_incr from utils_ner import InputExample, Split, TokenClassificationTask logger = logging.getLogger(__name__) class NER(TokenClassificationTask): def __init__(self, label_idx=-1): # in NER datasets, the last column...
-1
./scripts/distributed/torch-distributed-gpu-test.py
#!/usr/bin/env python # # This a `torch.distributed` diagnostics script that checks that all GPUs in the cluster (one or # many nodes) can talk to each other via nccl and allocate gpu memory. # # To run first adjust the number of processes and nodes: # # python -m torch.distributed.run --nproc_per_node 2 --nnodes 1 to...
#!/usr/bin/env python # # This a `torch.distributed` diagnostics script that checks that all GPUs in the cluster (one or # many nodes) can talk to each other via nccl and allocate gpu memory. # # To run first adjust the number of processes and nodes: # # python -m torch.distributed.run --nproc_per_node 2 --nnodes 1 to...
-1
./tests/models/funnel/test_modeling_funnel.py
# coding=utf-8 # Copyright 2020 HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law...
# coding=utf-8 # Copyright 2020 HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable law...
-1
./tests/pipelines/test_pipelines_image_segmentation.py
# Copyright 2021 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
# Copyright 2021 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
-1
./src/transformers/models/mbart/modeling_mbart.py
# coding=utf-8 # Copyright 2021, The Facebook AI Research Team and The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.or...
# coding=utf-8 # Copyright 2021, The Facebook AI Research Team and The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.or...
-1
./tests/pipelines/__init__.py
-1
./.git/hooks/applypatch-msg.sample
#!/bin/sh # # An example hook script to check the commit log message taken by # applypatch from an e-mail message. # # The hook should exit with non-zero status after issuing an # appropriate message if it wants to stop the commit. The hook is # allowed to edit the commit message file. # # To enable this hook, rename ...
#!/bin/sh # # An example hook script to check the commit log message taken by # applypatch from an e-mail message. # # The hook should exit with non-zero status after issuing an # appropriate message if it wants to stop the commit. The hook is # allowed to edit the commit message file. # # To enable this hook, rename ...
-1
./docs/source/en/tasks/translation.mdx
<!--Copyright 2022 The HuggingFace Team. All rights reserved. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed...
<!--Copyright 2022 The HuggingFace Team. All rights reserved. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed...
-1
./tests/onnx/test_onnx_v2.py
import os from pathlib import Path from tempfile import NamedTemporaryFile from unittest import TestCase from unittest.mock import patch import pytest from parameterized import parameterized from transformers import AutoConfig, PreTrainedTokenizerBase, is_tf_available, is_torch_available from transformers.onnx import...
import os from pathlib import Path from tempfile import NamedTemporaryFile from unittest import TestCase from unittest.mock import patch import pytest from parameterized import parameterized from transformers import AutoConfig, PreTrainedTokenizerBase, is_tf_available, is_torch_available from transformers.onnx import...
-1
./examples/research_projects/jax-projects/dataset-streaming/run_mlm_flax_stream.py
#!/usr/bin/env python # coding=utf-8 # Copyright 2021 The HuggingFace Team All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-...
#!/usr/bin/env python # coding=utf-8 # Copyright 2021 The HuggingFace Team All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-...
-1
./src/transformers/models/transfo_xl/tokenization_transfo_xl.py
# coding=utf-8 # Copyright 2018 Google AI, Google Brain and Carnegie Mellon University Authors and the HuggingFace Inc. team. # Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the Lice...
# coding=utf-8 # Copyright 2018 Google AI, Google Brain and Carnegie Mellon University Authors and the HuggingFace Inc. team. # Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the Lice...
-1
./tests/models/clipseg/test_processor_clipseg.py
# Copyright 2022 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
# Copyright 2022 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
-1
./src/transformers/models/vit/modeling_tf_vit.py
# coding=utf-8 # Copyright 2021 Google AI, Ross Wightman, The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/license...
# coding=utf-8 # Copyright 2021 Google AI, Ross Wightman, The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/license...
-1
./docs/source/en/model_doc/deit.mdx
<!--Copyright 2021 The HuggingFace Team. All rights reserved. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed...
<!--Copyright 2021 The HuggingFace Team. All rights reserved. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed...
-1
./src/transformers/models/flava/__init__.py
# flake8: noqa # There's no way to ignore "F401 '...' imported but unused" warnings in this # module, but to preserve other warnings. So, don't check this module at all. # Copyright 2022 Meta Platforms authors and The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "Licen...
# flake8: noqa # There's no way to ignore "F401 '...' imported but unused" warnings in this # module, but to preserve other warnings. So, don't check this module at all. # Copyright 2022 Meta Platforms authors and The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "Licen...
-1
./src/transformers/models/wav2vec2/convert_wav2vec2_original_s3prl_checkpoint_to_pytorch.py
# coding=utf-8 # Copyright 2021 The HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable...
# coding=utf-8 # Copyright 2021 The HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable...
-1
./src/transformers/models/squeezebert/__init__.py
# flake8: noqa # There's no way to ignore "F401 '...' imported but unused" warnings in this # module, but to preserve other warnings. So, don't check this module at all. # Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use thi...
# flake8: noqa # There's no way to ignore "F401 '...' imported but unused" warnings in this # module, but to preserve other warnings. So, don't check this module at all. # Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use thi...
-1
./tests/fixtures/tests_samples/MRPC/train.csv
label,sentence1,sentence2 equivalent,He said the foodservice pie business doesn 't fit the company 's long-term growth strategy .,""" The foodservice pie business does not fit our long-term growth strategy ." not_equivalent,Magnarelli said Racicot hated the Iraqi regime and looked forward to using his long years of tra...
label,sentence1,sentence2 equivalent,He said the foodservice pie business doesn 't fit the company 's long-term growth strategy .,""" The foodservice pie business does not fit our long-term growth strategy ." not_equivalent,Magnarelli said Racicot hated the Iraqi regime and looked forward to using his long years of tra...
-1
./src/transformers/models/deberta/tokenization_deberta.py
# coding=utf-8 # Copyright 2020 Microsoft and the HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required...
# coding=utf-8 # Copyright 2020 Microsoft and the HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required...
-1
./tests/models/opt/test_modeling_opt.py
# coding=utf-8 # Copyright 2021, The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless ...
# coding=utf-8 # Copyright 2021, The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless ...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./tests/utils/test_activations.py
# Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
# Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./tests/models/flava/test_feature_extraction_flava.py
# coding=utf-8 # Copyright 2022 Meta Platforms authors and HuggingFace Inc. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless requi...
# coding=utf-8 # Copyright 2022 Meta Platforms authors and HuggingFace Inc. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless requi...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py
# coding=utf-8 # Copyright 2020 The Google AI Language Team Authors, The HuggingFace Inc. team and Microsoft Corporation. # Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License....
# coding=utf-8 # Copyright 2020 The Google AI Language Team Authors, The HuggingFace Inc. team and Microsoft Corporation. # Copyright (c) 2018, NVIDIA CORPORATION. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License....
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./src/transformers/models/opt/modeling_flax_opt.py
# coding=utf-8 # Copyright 2022 The Fairseq Authors and The Google Flax Team Authors And The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # ...
# coding=utf-8 # Copyright 2022 The Fairseq Authors and The Google Flax Team Authors And The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # ...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./src/transformers/models/trocr/configuration_trocr.py
# coding=utf-8 # Copyright 2021 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
# coding=utf-8 # Copyright 2021 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./docs/source/en/model_doc/layoutlmv2.mdx
<!--Copyright 2021 The HuggingFace Team. All rights reserved. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed...
<!--Copyright 2021 The HuggingFace Team. All rights reserved. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./docs/source/en/model_doc/swin.mdx
<!--Copyright 2022 The HuggingFace Team. All rights reserved. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed...
<!--Copyright 2022 The HuggingFace Team. All rights reserved. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./templates/adding_a_new_model/tests/pt-encoder-bert-tokenizer.json
{ "modelname": "TemplatePT", "uppercase_modelname": "TEMPLATE_PT", "lowercase_modelname": "template_pt", "camelcase_modelname": "TemplatePt", "authors": "The HuggingFace Team", "checkpoint_identifier": "brand-new-bert-base-cased", "tokenizer_type": "Based on BERT", "generate_tensorflow_pytorch_and_flax"...
{ "modelname": "TemplatePT", "uppercase_modelname": "TEMPLATE_PT", "lowercase_modelname": "template_pt", "camelcase_modelname": "TemplatePt", "authors": "The HuggingFace Team", "checkpoint_identifier": "brand-new-bert-base-cased", "tokenizer_type": "Based on BERT", "generate_tensorflow_pytorch_and_flax"...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./examples/legacy/seq2seq/test_data/wmt_en_ro/val.target
Fostul șef al cabinetului prezidențial brazilian este adus în fața instanței Marți, un judecător federal a acceptat acuzațiile aduse împotriva fostului șef al cabinetului prezidențial brazilian pentru presupusa implicare a acestuia într-o schemă masivă de corupție privind compania petrolieră de stat Petrobras. Biroul p...
Fostul șef al cabinetului prezidențial brazilian este adus în fața instanței Marți, un judecător federal a acceptat acuzațiile aduse împotriva fostului șef al cabinetului prezidențial brazilian pentru presupusa implicare a acestuia într-o schemă masivă de corupție privind compania petrolieră de stat Petrobras. Biroul p...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./src/transformers/data/datasets/language_modeling.py
# Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
# Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./examples/legacy/seq2seq/train_distil_marian_enro_tpu.sh
# Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
# Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./examples/research_projects/seq2seq-distillation/distillation.py
#!/usr/bin/env python import argparse import gc import os import sys from pathlib import Path from typing import List # noqa: F401 import pytorch_lightning as pl import torch from torch import nn from finetune import SummarizationModule, TranslationModule from finetune import main as ft_main from make_student impor...
#!/usr/bin/env python import argparse import gc import os import sys from pathlib import Path from typing import List # noqa: F401 import pytorch_lightning as pl import torch from torch import nn from finetune import SummarizationModule, TranslationModule from finetune import main as ft_main from make_student impor...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/__init__.py
# flake8: noqa # There's no way to ignore "F401 '...' imported but unused" warnings in this # module, but to preserve other warnings. So, don't check this module at all. # Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use thi...
# flake8: noqa # There's no way to ignore "F401 '...' imported but unused" warnings in this # module, but to preserve other warnings. So, don't check this module at all. # Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use thi...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./docs/source/es/_toctree.yml
- sections: - local: index title: 🤗 Transformers - local: quicktour title: Tour rápido - local: installation title: Instalación title: Empezar - sections: - local: pipeline_tutorial title: Pipelines para inferencia - local: autoclass_tutorial title: Carga instancias preentrenadas con un...
- sections: - local: index title: 🤗 Transformers - local: quicktour title: Tour rápido - local: installation title: Instalación title: Empezar - sections: - local: pipeline_tutorial title: Pipelines para inferencia - local: autoclass_tutorial title: Carga instancias preentrenadas con un...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./tests/models/deformable_detr/__init__.py
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./tests/models/openai/__init__.py
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./src/transformers/models/whisper/modeling_whisper.py
# coding=utf-8 # Copyright 2022 The OpenAI Authors and The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/L...
# coding=utf-8 # Copyright 2022 The OpenAI Authors and The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/L...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./docs/source/es/autoclass_tutorial.mdx
<!--Copyright 2022 The HuggingFace Team. All rights reserved. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed...
<!--Copyright 2022 The HuggingFace Team. All rights reserved. Licensed under the Apache License, Version 2.0 (the "License"); you may not use this file except in compliance with the License. You may obtain a copy of the License at http://www.apache.org/licenses/LICENSE-2.0 Unless required by applicable law or agreed...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./src/transformers/models/trajectory_transformer/configuration_trajectory_transformer.py
# coding=utf-8 # Copyright 2022 The Trajectory Transformers paper authors and The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://ww...
# coding=utf-8 # Copyright 2022 The Trajectory Transformers paper authors and The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://ww...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./utils/test_module/custom_tokenization.py
from transformers import BertTokenizer class CustomTokenizer(BertTokenizer): pass
from transformers import BertTokenizer class CustomTokenizer(BertTokenizer): pass
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./src/transformers/models/mobilevit/convert_mlcvnets_to_pytorch.py
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable...
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./src/transformers/models/rag/modeling_tf_rag.py
# coding=utf-8 # Copyright 2020, The RAG Authors and The HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
# coding=utf-8 # Copyright 2020, The RAG Authors and The HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./src/transformers/models/segformer/convert_segformer_original_to_pytorch.py
# coding=utf-8 # Copyright 2021 The HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable...
# coding=utf-8 # Copyright 2021 The HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./src/transformers/models/whisper/modeling_tf_whisper.py
# coding=utf-8 # Copyright 2022 The OpenAI Authors and The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/L...
# coding=utf-8 # Copyright 2022 The OpenAI Authors and The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/L...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./src/transformers/pipelines/depth_estimation.py
from typing import List, Union import numpy as np from ..utils import add_end_docstrings, is_torch_available, is_vision_available, logging, requires_backends from .base import PIPELINE_INIT_ARGS, Pipeline if is_vision_available(): from PIL import Image from ..image_utils import load_image if is_torch_avai...
from typing import List, Union import numpy as np from ..utils import add_end_docstrings, is_torch_available, is_vision_available, logging, requires_backends from .base import PIPELINE_INIT_ARGS, Pipeline if is_vision_available(): from PIL import Image from ..image_utils import load_image if is_torch_avai...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./examples/legacy/seq2seq/sentence_splitter.py
# Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
# Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./src/transformers/models/mluke/__init__.py
# flake8: noqa # There's no way to ignore "F401 '...' imported but unused" warnings in this # module, but to preserve other warnings. So, don't check this module at all. # Copyright 2021 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use thi...
# flake8: noqa # There's no way to ignore "F401 '...' imported but unused" warnings in this # module, but to preserve other warnings. So, don't check this module at all. # Copyright 2021 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use thi...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./scripts/pegasus/build_test_sample_spm_no_bos.py
#!/usr/bin/env python # Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless...
#!/usr/bin/env python # Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./examples/research_projects/codeparrot/scripts/pretokenizing.py
import multiprocessing import time from datasets import load_dataset from arguments import PretokenizationArguments from transformers import AutoTokenizer, HfArgumentParser def tokenize(example): output = dict() output["input_ids"] = tokenizer(example["content"], truncation=False)["input_ids"] output["r...
import multiprocessing import time from datasets import load_dataset from arguments import PretokenizationArguments from transformers import AutoTokenizer, HfArgumentParser def tokenize(example): output = dict() output["input_ids"] = tokenizer(example["content"], truncation=False)["input_ids"] output["r...
-1
huggingface/transformers
20,209
Add gpt-sw3 model to transformers
This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models are English, Swedish, Norwegian, D...
ekgren
2022-11-14T14:04:00Z
2022-12-12T18:12:13Z
b58beebe7286bf53a80f137e0e5cd100ccb77ae2
5f94855dc31242d15d755b0d97ec6a0479ee0ea9
Add gpt-sw3 model to transformers. This adds the gpt-sw3 models and tokenizer to hf. The models are developed by AI Sweden and others. They are gpt models trained from scratch with the nemo-megatron framework and will initially range in sizes from 128m to 20B. The models are multilingual and the languages in the models...
./src/transformers/models/gpt2/tokenization_gpt2_fast.py
# coding=utf-8 # Copyright 2018 The Open AI Team Authors and The HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # ...
# coding=utf-8 # Copyright 2018 The Open AI Team Authors and The HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # ...
-1
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./src/transformers/image_processing_utils.py
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable...
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable...
1
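The PR 20205 description above refers to converting an image processor's `size` parameter into a dictionary format, with a `param_name` flag so log messages name the variable being converted (e.g. `crop_size` vs `size`). A minimal sketch of what such a helper could look like — this is an illustration of the idea, not the actual `transformers` implementation, and the exact dict keys chosen here are assumptions:

```python
import logging

logging.basicConfig(level=logging.INFO)
logger = logging.getLogger(__name__)


def get_size_dict(size, default_to_square=True, param_name="size"):
    """Convert an int or (height, width) pair into the dict format image
    processors expect, logging the conversion under `param_name`.

    Hypothetical sketch: the real library's behavior may differ."""
    if isinstance(size, dict):
        return size
    if isinstance(size, int):
        # An int means a square size, or a shortest-edge constraint if
        # default_to_square is disabled.
        size_dict = (
            {"height": size, "width": size}
            if default_to_square
            else {"shortest_edge": size}
        )
    else:
        # Assume a (height, width) sequence.
        size_dict = {"height": size[0], "width": size[1]}
    # param_name makes the log line reflect the variable being updated.
    logger.info(
        "%s should be a dictionary; converted %s to %s.",
        param_name, param_name, size_dict,
    )
    return size_dict
```

For example, `get_size_dict(224)` yields a square dict, while `get_size_dict(256, default_to_square=False, param_name="crop_size")` yields a shortest-edge dict and logs the message under `crop_size`.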
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./src/transformers/models/beit/image_processing_beit.py
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
1
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./src/transformers/models/clip/image_processing_clip.py
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
1
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./src/transformers/models/deit/image_processing_deit.py
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
1
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./src/transformers/models/flava/image_processing_flava.py
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
1
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./src/transformers/models/levit/image_processing_levit.py
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
1
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./src/transformers/models/mobilenet_v2/image_processing_mobilenet_v2.py
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
1
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./src/transformers/models/mobilevit/image_processing_mobilevit.py
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
1
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./src/transformers/models/perceiver/image_processing_perceiver.py
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
1
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./src/transformers/models/poolformer/image_processing_poolformer.py
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
1
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./src/transformers/models/segformer/image_processing_segformer.py
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
1
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./src/transformers/models/videomae/image_processing_videomae.py
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless r...
1
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./src/transformers/models/rembert/tokenization_rembert.py
# coding=utf-8 # Copyright The HuggingFace Team and The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICE...
# coding=utf-8 # Copyright The HuggingFace Team and The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICE...
-1
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./src/transformers/models/markuplm/processing_markuplm.py
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable...
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable...
-1
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./src/transformers/models/mobilenet_v2/convert_original_tf_checkpoint_to_pytorch.py
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable...
# coding=utf-8 # Copyright 2022 The HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicable...
-1
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./src/transformers/models/longt5/modeling_longt5.py
# coding=utf-8 # Copyright 2022 Google LLC., LongT5 Authors and HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # U...
# coding=utf-8 # Copyright 2022 Google LLC., LongT5 Authors and HuggingFace Inc. team. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # U...
-1
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./examples/legacy/seq2seq/xla_spawn.py
# Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
# Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
-1
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./tests/models/bart/test_modeling_bart.py
# coding=utf-8 # Copyright 2021, The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless ...
# coding=utf-8 # Copyright 2021, The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless ...
-1
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./examples/legacy/seq2seq/pack_dataset.py
#!/usr/bin/env python # Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless...
#!/usr/bin/env python # Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless...
-1
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./tests/models/nystromformer/__init__.py
-1
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./src/transformers/models/esm/tokenization_esm.py
# coding=utf-8 # Copyright 2022 Meta and The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # #...
# coding=utf-8 # Copyright 2022 Meta and The HuggingFace Inc. team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # #...
-1
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./src/transformers/commands/train.py
# Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
# Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
-1
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./src/transformers/models/gptj/__init__.py
# flake8: noqa # There's no way to ignore "F401 '...' imported but unused" warnings in this # module, but to preserve other warnings. So, don't check this module at all. # Copyright 2021 The EleutherAI and HuggingFace Teams. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you...
# flake8: noqa # There's no way to ignore "F401 '...' imported but unused" warnings in this # module, but to preserve other warnings. So, don't check this module at all. # Copyright 2021 The EleutherAI and HuggingFace Teams. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you...
-1
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./src/transformers/models/x_clip/modeling_x_clip.py
# coding=utf-8 # Copyright 2022 Microsoft Research and The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENS...
# coding=utf-8 # Copyright 2022 Microsoft Research and The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENS...
-1
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./tests/models/whisper/test_tokenization_whisper.py
# Copyright 2022 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
# Copyright 2022 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
-1
huggingface/transformers
20,205
Make size_dict conversion logs clearer
# What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to make the logs clearer. ## Before submit...
amyeroberts
2022-11-14T12:52:07Z
2022-11-15T10:52:58Z
f1e8c48c5eebf899a5c79b2c48c0ef8456e6bddc
55ba31908a1216c1767463e3333aa94a6414e6d6
Make size_dict conversion logs clearer. # What does this PR do? * Tidies up logic for converting `size` parameter to the expected dictionary format for image processors. * Adds `param_name` as a flag so logs reflect the variable being updated e.g. `crop_size` versus `size` Address part of #20185 - trying to mak...
./src/transformers/integrations.py
# Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
# Copyright 2020 The HuggingFace Team. All rights reserved. # # Licensed under the Apache License, Version 2.0 (the "License"); # you may not use this file except in compliance with the License. # You may obtain a copy of the License at # # http://www.apache.org/licenses/LICENSE-2.0 # # Unless required by applicabl...
-1