Tags: Video-Text-to-Text · Transformers · PyTorch · Safetensors · English · Chinese · llama · text-generation · custom_code · text-generation-inference
Instructions to use KangarooGroup/kangaroo with libraries, inference providers, notebooks, and local apps.
How to use KangarooGroup/kangaroo with Transformers:
```python
# Load the model directly (custom code in the repo requires trust_remote_code=True)
from transformers import AutoTokenizer, AutoModelForCausalLM

tokenizer = AutoTokenizer.from_pretrained("KangarooGroup/kangaroo", trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained("KangarooGroup/kangaroo", trust_remote_code=True)
```
Update modeling_kangaroo.py (#6), opened by Jiqing
Transformers has deprecated `get_max_length`; we should use `get_seq_length` instead to avoid the following error:
```
  File "/root/.cache/huggingface/modules/transformers_modules/KangarooGroup/kangaroo/_ed0c00d24ea319ca1cd549bb890dd577f3fed7b/modeling_kangaroo.py", line 1397, in chat
    outputs = self.generate(
  File "/usr/local/lib/python3.10/dist-packages/torch/utils/_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
  File "/root/.cache/huggingface/modules/transformers_modules/KangarooGroup/kangaroo/_ed0c00d24ea319ca1cd549bb890dd577f3fed7b/modeling_kangaroo.py", line 1294, in generate
    return super().generate(inputs_embeds=encoder_input, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/torch/utils/_contextlib.py", line 116, in decorate_context
    return func(*args, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/transformers/generation/utils.py", line 2223, in generate
    result = self._sample(
  File "/usr/local/lib/python3.10/dist-packages/transformers/generation/utils.py", line 3204, in _sample
    model_inputs = self.prepare_inputs_for_generation(input_ids, **model_kwargs)
  File "/root/.cache/huggingface/modules/transformers_modules/KangarooGroup/kangaroo/_ed0c00d24ea319ca1cd549bb890dd577f3fed7b/modeling_kangaroo.py", line 1312, in prepare_inputs_for_generation
    if past_key_values.get_max_length() is not None
  File "/usr/local/lib/python3.10/dist-packages/torch/nn/modules/module.py", line 1928, in __getattr__
    raise AttributeError(
AttributeError: 'DynamicCache' object has no attribute 'get_max_length'. Did you mean: 'get_seq_length'?
```
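The fix amounts to the one-method substitution the error message itself suggests. Below is a minimal sketch of the affected check; the stub class and the helper function are hypothetical stand-ins written for illustration, not the actual `DynamicCache` from transformers or the real code in modeling_kangaroo.py:

```python
class StubDynamicCache:
    """Hypothetical stand-in for transformers' DynamicCache: recent releases
    removed get_max_length, so only get_seq_length is available."""

    def __init__(self, seq_length=0):
        self._seq_length = seq_length

    def get_seq_length(self):
        # Number of tokens already stored in the cache for this generation.
        return self._seq_length


def cache_length(past_key_values):
    # Before the fix, the model code called past_key_values.get_max_length(),
    # which raises AttributeError on current transformers releases.
    # The PR swaps it for get_seq_length(), used here:
    if past_key_values is not None and past_key_values.get_seq_length() is not None:
        return past_key_values.get_seq_length()
    return 0
```

With transformers installed, the same substitution inside `prepare_inputs_for_generation` (line 1312 in the traceback above) resolves the `AttributeError`.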