mariasandu committed (verified) · commit f853434 · 1 parent: fbb4985

Update README.md

Files changed (1): README.md (+8 −3)
````diff
--- README.md
+++ README.md
@@ -17,11 +17,16 @@ It has been trained using [TRL](https://github.com/huggingface/trl).
 ## Quick start
 
 ```python
-from transformers import pipeline
+import torch
+from transformers import pipeline, AutoTokenizer
 
 question = "Write a Python function that takes a list of numbers and returns the list sorted in ascending order without using the built-in `sorted()` function."
-generator = pipeline("text-generation", model="mariasandu/python-coding-assistant-v2", device="cuda")
-output = generator([{"role": "user", "content": question}], max_new_tokens=128, return_full_text=False)[0]
+generator = pipeline("text-generation", model="mariasandu/python-coding-assistant-v2", device="cpu")
+
+tokenizer = AutoTokenizer.from_pretrained("mariasandu/python-coding-assistant-v2")
+tokenizer.chat_template = "{% for message in messages %}{% if message['role'] == 'user' %}[INST] {{ message['content'] }} [/INST]{% else %}{{ message['content'] }}{% endif %}{% endfor %}"
+formatted_chat = tokenizer.apply_chat_template(chat, tokenize=True, return_dict=True, continue_final_message=True)
+output = generator(question)
 print(output["generated_text"])
 ```
````
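The added lines install a custom `[INST]`-style chat template on the tokenizer: each user turn is wrapped in `[INST] … [/INST]`, and any other turn (e.g. an assistant reply) is emitted verbatim with no separator. A plain-Python sketch of that rendering (an illustration of what the Jinja template produces, not the `transformers` implementation; `render_chat` is a hypothetical helper name):

```python
def render_chat(messages):
    """Mirror the chat template set in this commit: wrap user turns
    in [INST] ... [/INST]; emit all other turns verbatim."""
    parts = []
    for message in messages:
        if message["role"] == "user":
            parts.append(f"[INST] {message['content']} [/INST]")
        else:
            parts.append(message["content"])
    return "".join(parts)


prompt = render_chat([
    {"role": "user", "content": "Sort a list without using sorted()."},
])
print(prompt)  # [INST] Sort a list without using sorted(). [/INST]
```

Note that, as committed, the new snippet passes an undefined `chat` variable to `apply_chat_template`, never uses `formatted_chat`, and feeds the raw `question` string to the pipeline instead of the templated prompt; a runnable version would pass the rendered prompt (or the message list itself) to `generator`.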