🧩 Quizzes and Adventures 🏰 with Character Codex and llamafile
Last updated: December 10, 2024

Let's build something fun with the newly released Character Codex dataset, a collection of information about popular characters from a wide range of media types and genres…
We will use Haystack for orchestration and llamafile to run our model locally.
We will first build a simple quiz game in which the user has to guess a character based on hints. Then we will try to make two characters interact in a chat, and even go on an adventure together!
Preparation
Install dependencies
! pip install haystack-ai datasets
Load and take a look at the Character Codex dataset
from datasets import load_dataset
dataset = load_dataset("NousResearch/CharacterCodex", split="train")
/usr/local/lib/python3.10/dist-packages/huggingface_hub/utils/_token.py:89: UserWarning:
The secret `HF_TOKEN` does not exist in your Colab secrets.
To authenticate with the Hugging Face Hub, create a token in your settings tab (https://huggingface.co/settings/tokens), set it as secret in your Google Colab and restart your session.
You will be able to reuse this secret in all of your notebooks.
Please note that authentication is recommended but still optional to access public models or datasets.
warnings.warn(
Downloading readme: 0%| | 0.00/4.35k [00:00<?, ?B/s]
Downloading data: 0%| | 0.00/11.2M [00:00<?, ?B/s]
Generating train split: 0 examples [00:00, ? examples/s]
len(dataset)
15939
dataset[0]
{'media_type': 'Webcomics',
'genre': 'Fantasy Webcomics',
'character_name': 'Alana',
'media_source': 'Saga',
'description': 'Alana is one of the main characters from the webcomic "Saga." She is a strong-willed and fiercely protective mother who is on the run with her family in a war-torn galaxy. The story blends elements of fantasy and science fiction, creating a rich and complex narrative.',
'scenario': "You are a fellow traveler in the galaxy needing help, and Alana offers her assistance while sharing stories of her family's struggles and triumphs."}
OK, so each row of this dataset contains some information about a character. It also includes a creative `scenario`, but we won't use it.
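For instance, to get a quick feel for the variety of the collection, we can count how many characters each media type contributes. This is a minimal sketch that only relies on the columns shown in the record above:
from collections import Counter
# column access on a Hugging Face Dataset returns a plain list of values
media_type_counts = Counter(dataset["media_type"])
print(media_type_counts.most_common(5))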
llamafile: download and run the model
For our experiments, we will use the Llama-3-8B-Instruct model: a small but good language model.
llamafile is a project by Mozilla that simplifies access to LLMs. It packages the model and the inference engine into a single executable file.
We will use it to run our model.
llamafile is designed to run on standard computers, so we will apply a few tricks to make it work in Colab. For instructions on how to run it on your PC, check out the documentation and the Haystack-llamafile integration page.
# download the model
!wget "https://hugging-face.cn/Mozilla/Meta-Llama-3-8B-Instruct-llamafile/resolve/main/Meta-Llama-3-8B-Instruct.Q5_K_M.llamafile"
--2024-06-20 09:53:30-- https://hugging-face.cn/Mozilla/Meta-Llama-3-8B-Instruct-llamafile/resolve/main/Meta-Llama-3-8B-Instruct.Q5_K_M.llamafile
Resolving huggingface.co (huggingface.co)... 18.239.50.103, 18.239.50.80, 18.239.50.49, ...
Connecting to huggingface.co (huggingface.co)|18.239.50.103|:443... connected.
HTTP request sent, awaiting response... 302 Found
Location: https://cdn-lfs-us-1.huggingface.co/repos/e3/ee/e3eefe425bce2ecb595973e24457616c48776aa0665d9bab33a29b582f3dfdf0/23365cb45398a3c568dda780a404b5f9a847b865d8341ec500ca3063a1f99eed?response-content-disposition=inline%3B+filename*%3DUTF-8%27%27Meta-Llama-3-8B-Instruct.Q5_K_M.llamafile%3B+filename%3D%22Meta-Llama-3-8B-Instruct.Q5_K_M.llamafile%22%3B&Expires=1719136410&Policy=eyJTdGF0ZW1lbnQiOlt7IkNvbmRpdGlvbiI6eyJEYXRlTGVzc1RoYW4iOnsiQVdTOkVwb2NoVGltZSI6MTcxOTEzNjQxMH19LCJSZXNvdXJjZSI6Imh0dHBzOi8vY2RuLWxmcy11cy0xLmh1Z2dpbmdmYWNlLmNvL3JlcG9zL2UzL2VlL2UzZWVmZTQyNWJjZTJlY2I1OTU5NzNlMjQ0NTc2MTZjNDg3NzZhYTA2NjVkOWJhYjMzYTI5YjU4MmYzZGZkZjAvMjMzNjVjYjQ1Mzk4YTNjNTY4ZGRhNzgwYTQwNGI1ZjlhODQ3Yjg2NWQ4MzQxZWM1MDBjYTMwNjNhMWY5OWVlZD9yZXNwb25zZS1jb250ZW50LWRpc3Bvc2l0aW9uPSoifV19&Signature=swp5azcPl0FOe5CuStFZn1hmF0SHimPUOLwOqHd2ZAnoFJuYKDjhK7ESplRDWJdma9QWYOQyaCG23wkX18urieav%7E8OdxzwLaKLhL2YFx3L6RMwGEKWjrG-ql-LDfd2I1U4AcSXJZR5zHSBDmYql9M9hXKsvXHVkraIMS-cDx0ihj3s7yu4gbjUfE3SPg49aStq00ORcQnDV90mXxeheM6UjRymLRBdlxI3PCpAjzvyExcmZSgBU5vCnKtAEy5b65%7EzQoX5TVQTzQXjE9x8Qr2%7EAONSc7wy671HWYPRKNgZDrH3NJy90uFp38GKiQtab7hAy6fUlL358OQYhHzu4-Q__&Key-Pair-Id=K2FPYV99P2N66Q [following]
--2024-06-20 09:53:30-- https://cdn-lfs-us-1.huggingface.co/repos/e3/ee/e3eefe425bce2ecb595973e24457616c48776aa0665d9bab33a29b582f3dfdf0/23365cb45398a3c568dda780a404b5f9a847b865d8341ec500ca3063a1f99eed?response-content-disposition=inline%3B+filename*%3DUTF-8%27%27Meta-Llama-3-8B-Instruct.Q5_K_M.llamafile%3B+filename%3D%22Meta-Llama-3-8B-Instruct.Q5_K_M.llamafile%22%3B&Expires=1719136410&Policy=eyJTdGF0ZW1lbnQiOlt7IkNvbmRpdGlvbiI6eyJEYXRlTGVzc1RoYW4iOnsiQVdTOkVwb2NoVGltZSI6MTcxOTEzNjQxMH19LCJSZXNvdXJjZSI6Imh0dHBzOi8vY2RuLWxmcy11cy0xLmh1Z2dpbmdmYWNlLmNvL3JlcG9zL2UzL2VlL2UzZWVmZTQyNWJjZTJlY2I1OTU5NzNlMjQ0NTc2MTZjNDg3NzZhYTA2NjVkOWJhYjMzYTI5YjU4MmYzZGZkZjAvMjMzNjVjYjQ1Mzk4YTNjNTY4ZGRhNzgwYTQwNGI1ZjlhODQ3Yjg2NWQ4MzQxZWM1MDBjYTMwNjNhMWY5OWVlZD9yZXNwb25zZS1jb250ZW50LWRpc3Bvc2l0aW9uPSoifV19&Signature=swp5azcPl0FOe5CuStFZn1hmF0SHimPUOLwOqHd2ZAnoFJuYKDjhK7ESplRDWJdma9QWYOQyaCG23wkX18urieav%7E8OdxzwLaKLhL2YFx3L6RMwGEKWjrG-ql-LDfd2I1U4AcSXJZR5zHSBDmYql9M9hXKsvXHVkraIMS-cDx0ihj3s7yu4gbjUfE3SPg49aStq00ORcQnDV90mXxeheM6UjRymLRBdlxI3PCpAjzvyExcmZSgBU5vCnKtAEy5b65%7EzQoX5TVQTzQXjE9x8Qr2%7EAONSc7wy671HWYPRKNgZDrH3NJy90uFp38GKiQtab7hAy6fUlL358OQYhHzu4-Q__&Key-Pair-Id=K2FPYV99P2N66Q
Resolving cdn-lfs-us-1.huggingface.co (cdn-lfs-us-1.huggingface.co)... 18.239.94.84, 18.239.94.6, 18.239.94.3, ...
Connecting to cdn-lfs-us-1.huggingface.co (cdn-lfs-us-1.huggingface.co)|18.239.94.84|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 5768624435 (5.4G) [binary/octet-stream]
Saving to: ‘Meta-Llama-3-8B-Instruct.Q5_K_M.llamafile’
Meta-Llama-3-8B-Ins 100%[===================>] 5.37G 57.2MB/s in 1m 40s
2024-06-20 09:55:11 (54.8 MB/s) - ‘Meta-Llama-3-8B-Instruct.Q5_K_M.llamafile’ saved [5768624435/5768624435]
# make the llamafile executable
! chmod +x Meta-Llama-3-8B-Instruct.Q5_K_M.llamafile
Run the model - relevant parameters:
- `--server`: start an OpenAI-compatible server
- `--nobrowser`: do not open the interactive interface in the browser
- `--port`: port of the OpenAI-compatible server (in Colab, 8080 is already in use)
- `--n-gpu-layers`: offload some layers to the GPU for better performance
- `--ctx-size`: size of the prompt context
# we prepend "nohup" and postpend "&" to make the Colab cell run in background
! nohup ./Meta-Llama-3-8B-Instruct.Q5_K_M.llamafile \
--server \
--nobrowser \
--port 8081 \
--n-gpu-layers 999 \
--ctx-size 8192 \
> llamafile.log &
nohup: redirecting stderr to stdout
# we check the logs until the server has been started correctly
!while ! grep -q "llama server listening" llamafile.log; do tail -n 5 llamafile.log; sleep 10; done
Let's try to interact with the model.
Since the server is OpenAI-compatible, we can use the OpenAIChatGenerator.
from haystack.components.generators.chat import OpenAIChatGenerator
from haystack.dataclasses import ChatMessage
from haystack.utils import Secret
generator = OpenAIChatGenerator(
api_key=Secret.from_token("sk-no-key-required"), # for compatibility with the OpenAI API, a placeholder api_key is needed
model="LLaMA_CPP",
api_base_url="https://:8081/v1",
generation_kwargs = {"max_tokens": 50}
)
generator.run(messages=[ChatMessage.from_user("How are you?")])
{'replies': [ChatMessage(content="I'm just a language model, I don't have emotions or feelings like humans do. However, I'm functioning properly and ready to assist you with any questions or tasks you may have. How can I help you today?<|eot_id|>", role=<ChatRole.ASSISTANT: 'assistant'>, name=None, meta={'model': 'LLaMA_CPP', 'index': 0, 'finish_reason': 'stop', 'usage': {'completion_tokens': 46, 'prompt_tokens': 14, 'total_tokens': 60}})]}
🕵️ Guess the Character Quiz
Now that everything is in place, we can build a simple game in which a character is randomly selected from the dataset and the LLM generates hints for the player.
Hint generation pipeline
This simple pipeline consists of a ChatPromptBuilder and an OpenAIChatGenerator.
Thanks to the templated messages, we can include the character information in the prompt, as well as the previously given hints, to avoid repeating them.
from haystack import Pipeline
from haystack.dataclasses import ChatMessage
from haystack.utils import Secret
from haystack.components.builders import ChatPromptBuilder
from haystack.components.generators.chat import OpenAIChatGenerator
template_messages = [
ChatMessage.from_system("You are a helpful assistant that gives brief hints about a character, without revealing the character's name."),
ChatMessage.from_user("""Provide a brief hint (one fact only) for the following character.
{{character}}
Use the information provided, before resorting to your own knowledge.
Do not repeat previously given hints.
{% if previous_hints| length > 0 %}
Previous hints:
{{previous_hints}}
{% endif %}""")
]
chat_prompt_builder = ChatPromptBuilder(template=template_messages, required_variables=["character"])
generator = OpenAIChatGenerator(
api_key=Secret.from_token("sk-no-key-required"), # for compatibility with the OpenAI API, a placeholder api_key is needed
model="LLaMA_CPP",
api_base_url="https://:8081/v1",
generation_kwargs = {"max_tokens": 100}
)
hint_generation_pipeline = Pipeline()
hint_generation_pipeline.add_component("chat_prompt_builder", chat_prompt_builder)
hint_generation_pipeline.add_component("generator", generator)
hint_generation_pipeline.connect("chat_prompt_builder", "generator")
<haystack.core.pipeline.pipeline.Pipeline object at 0x7c0f4a07f580>
🚅 Components
- chat_prompt_builder: ChatPromptBuilder
- generator: OpenAIChatGenerator
🛤️ Connections
- chat_prompt_builder.prompt -> generator.messages (List[ChatMessage])
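Before wiring the full game, we can give the pipeline a quick spin. This is just a sanity-check sketch, using the first record of the dataset as the character and no previous hints:
sample_character = dict(dataset[0])
# drop the scenario field, as in the game below
sample_character.pop("scenario", None)
res = hint_generation_pipeline.run({"character": sample_character, "previous_hints": []})
print(res["generator"]["replies"][0].text)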
The game
import random
MAX_HINTS = 3
random_character = random.choice(dataset)
# remove the scenario: we do not use it
del random_character["scenario"]
print("🕵️ Guess the character based on the hints!")
previous_hints = []
for hint_number in range(1, MAX_HINTS + 1):
    res = hint_generation_pipeline.run({"character": random_character, "previous_hints": previous_hints})
    hint = res["generator"]["replies"][0].text
    previous_hints.append(hint)
    print(f"✨ Hint {hint_number}: {hint}")
    guess = input("Your guess: \nPress Q to quit\n")
    if guess.lower() == 'q':
        break
    print("Guess: ", guess)
    if random_character['character_name'].lower() in guess.lower():
        print("🎉 Congratulations! You guessed it right!")
        break
    else:
        print("❌ Wrong guess. Try again.")
else:
    print(f"🙁 Sorry, you've used all the hints. The character was {random_character['character_name']}.")
🕵️ Guess the character based on the hints!
✨ Hint 1: Here's a brief hint:
This actor has won an Academy Award for his role in a biographical sports drama film.<|eot_id|>
Your guess:
Press Q to quit
Tom Cruise?
Guess: Tom Cruise?
❌ Wrong guess. Try again.
✨ Hint 2: Here's a new hint:
This actor is known for his intense physical transformations to portray his characters, including a significant weight gain and loss for one of his most iconic roles.<|eot_id|>
Your guess:
Press Q to quit
Brendan Fraser
Guess: Brendan Fraser
❌ Wrong guess. Try again.
✨ Hint 3: Here's a new hint:
This actor has played a character who is a comic book superhero.<|eot_id|>
Your guess:
Press Q to quit
Christian Bale
Guess: Christian Bale
🎉 Congratulations! You guessed it right!
💬 🤠 Chat adventures
Now let's try something different!
Character Codex is a large collection of characters, each with a specific description. Llama 3 8B Instruct is a good model with some world knowledge.
We can try to combine them to simulate a conversation, and perhaps an adventure, involving two different characters (fictional or real).
Character pipeline
Let's create a character pipeline: ChatPromptBuilder + OpenAIChatGenerator.
This represents the core of our conversational system and will be invoked several times with different messages to simulate a conversation.
from haystack import Pipeline
from haystack.dataclasses import ChatMessage, ChatRole
from haystack.utils import Secret
from haystack.components.builders import ChatPromptBuilder
from haystack.components.generators.chat import OpenAIChatGenerator
character_pipeline = Pipeline()
character_pipeline.add_component("chat_prompt_builder", ChatPromptBuilder(required_variables=["character_data"]))
character_pipeline.add_component("generator", OpenAIChatGenerator(
api_key=Secret.from_token("sk-no-key-required"), # for compatibility with the OpenAI API, a placeholder api_key is needed
model="LLaMA_CPP",
api_base_url="https://:8081/v1",
generation_kwargs = {"temperature": 1.5}
))
character_pipeline.connect("chat_prompt_builder", "generator")
<haystack.core.pipeline.pipeline.Pipeline object at 0x78dd00ce69e0>
🚅 Components
- chat_prompt_builder: ChatPromptBuilder
- generator: OpenAIChatGenerator
🛤️ Connections
- chat_prompt_builder.prompt -> generator.messages (List[ChatMessage])
Messages
We define the most relevant messages to guide our LLM engine.
- System message (template): instructs the language model to chat, impersonating a specific character.
- Start message: we need to choose an initial message (and the character who speaks first) to kick off the conversation.
We also define an `invert_roles` utility function: for example, we want the first character to see the second character's assistant messages as user messages, and so on.
system_message = ChatMessage.from_system("""You are: {{character_data['character_name']}}.
Description of your character: {{character_data['description']}}.
Stick to your character's personality and engage in a conversation with an unknown person. Don't make long monologues.""")
start_message = ChatMessage.from_user("Hello, who are you?")
from typing import List
def invert_roles(messages: List[ChatMessage]):
    inverted_messages = []
    for message in messages:
        if message.is_from(ChatRole.USER):
            inverted_messages.append(ChatMessage.from_assistant(message.text))
        elif message.is_from(ChatRole.ASSISTANT):
            inverted_messages.append(ChatMessage.from_user(message.text))
        else:
            inverted_messages.append(message)
    return inverted_messages
The game
Now it's time to choose two characters and start playing.
We choose the popular dancer Fred Astaire and Corporal Dwayne Hicks from the Alien saga.
from rich import print
first_character_data = dataset.filter(lambda x: x["character_name"] == "Fred Astaire")[0]
second_character_data = dataset.filter(lambda x: x["character_name"] == "Corporal Dwayne Hicks")[0]
first_name = first_character_data["character_name"]
second_name = second_character_data["character_name"]
# remove the scenario: we do not use it
del first_character_data["scenario"]
del second_character_data["scenario"]
MAX_TURNS = 20
first_character_messages = [system_message, start_message]
second_character_messages = [system_message]
turn = 1
print(f"{first_name} 🕺: {start_message.text}")
while turn < MAX_TURNS:
    second_character_messages = invert_roles(first_character_messages)
    new_message = character_pipeline.run({"template": second_character_messages, "template_variables": {"character_data": second_character_data}})["generator"]["replies"][0]
    second_character_messages.append(new_message)
    print(f"\n\n{second_name} 🪖: {new_message.text}")
    turn += 1
    print("-"*20)
    first_character_messages = invert_roles(second_character_messages)
    new_message = character_pipeline.run({"template": first_character_messages, "template_variables": {"character_data": first_character_data}})["generator"]["replies"][0]
    first_character_messages.append(new_message)
    print(f"\n\n{first_name} 🕺: {new_message.text}")
    turn += 1
Fred Astaire 🕺: Hello, who are you?
Corporal Dwayne Hicks 🪖: Just a survivor, looking for a way out of this mess. You with me? We gotta get out of here, those... things... are all over the place.<|eot_id|>
--------------------
Fred Astaire 🕺: (adjusting his top hat) Ah, my dear fellow, I'm Fred Astaire, a performer of song, dance, and wit. I'm not quite sure what sort of "mess" you're referring to, but I'm always up for a challenge. However, I do hope it involves some dashing rescue, a clever escape, and perhaps a spirited tune or two. Are you prepared to join forces and see this predicament through with a bit of style and panache?<|eot_id|>
Corporal Dwayne Hicks 🪖: (skeptical) Hold up, partner. We're in the middle of a firefight with giant killing machines here. This ain't no movie musical. We gotta keep our eyes open and our guns hot if we're gonna make it out alive. I appreciate the bravado, Fred, but let's keep our priorities straight. You wanna help me take out these xenomorphs?<|eot_id|>
--------------------
Fred Astaire 🕺: (chuckling) Ah, my dear chap, you're right, of course. I suppose I got a bit carried away with the romance of the situation. Xenomorphs, you say? Well, I suppose they're a bit more formidable than the usual assortment of chorus girls and gangsters I've had the pleasure of tangling with. (pats his pockets, checking for his cane) Now, I'm not one for firearms, but I do have a few tricks up my sleeve. That cane of mine may come in handy, don't you think? And I've always been rather good at thinking on my feet. Let's see... (taps chin thoughtfully) Perhaps we could use a bit of misdirection, a dash of distraction, and a healthy dose of old-fashioned showmanship to take out these creatures. What do you say, partner?<|eot_id|>
Corporal Dwayne Hicks 🪖: (impressed) Alright, Fred, you might be more useful than I thought. That cane could come in handy for swatting at them, and your... showmanship could help distract them long enough for me to get a clear shot. Just remember, we're in this together, and we need to watch each other's backs. And don't even think about trying to do any fancy dancing or singing - we need to stay focused. Let's move out, and try to make a plan of attack. Stay sharp.<|eot_id|>
--------------------
Fred Astaire 🕺: (grinning) Ah, excellent! I do love a good partnership, and I must say, this is quite the adventure we're having! (pats his cane reassuringly) Don't worry, I won't get too carried away with the tap shoes just yet. (glances around, taking in the surroundings) Ah, yes... a plan of attack, you say? Well, I think I see an opening... (spots something) Ah ha! There's a ventilation shaft just ahead, looks like it hasn't been touched yet. Why don't we make a run for it and try to lose them in there? We can regroup and come up with a new plan once we're safely out of sight. What do you say, partner?<|eot_id|>
Corporal Dwayne Hicks 🪖: (nodding) Alright, let's move! Stay close and keep your wits about you. We don't know what's on the other side of that shaft. (glances back at Fred) And try not to get too distracted, we need to keep our priorities straight. Move, move, move!<|eot_id|>
--------------------
Fred Astaire 🕺: (laughs) Oh, I'm not getting distracted, my dear chap! (jumps into action) I'm just making sure we make a stylish exit, that's all! (darts towards the ventilation shaft, cane at the ready) Now, shall we make like a couple of ghosts and disappear into the unknown? (smirks)<|eot_id|>
Corporal Dwayne Hicks 🪖: (chuckles) Alright, alright, Fred! Let's do this! (follows close behind, keeping his eyes scanning for any signs of danger) Stay sharp, partner!<|eot_id|>
--------------------
Fred Astaire 🕺: (grinning) Ah, sharp as a tack, my good fellow! (climbs up into the ventilation shaft, cane first) And now, let's see where this adventure takes us! (disappears into the darkness, voice echoing back) Ta-ra, partner! Stay close behind!<|eot_id|>
Corporal Dwayne Hicks 🪖: (muttering to himself) Great, just what I need. A showman leading the charge. (climbs up into the ventilation shaft, hand on the grip of his shotgun) Alright, let's get moving. And try to stay quiet, we don't know what's waiting for us up there. (follows Fred into the darkness, eyes adjusting to the dim light)<|eot_id|>
--------------------
Fred Astaire 🕺: (voice echoes back through the ventilation shaft) Ah, don't worry about me being quiet, my dear chap! I'm as stealthy as a ghost in a gauze veil! (pauses, listens intently) Ah, do you hear that? It sounds like we're not alone up here... (whispers) And I think it's getting closer...<|eot_id|>
Corporal Dwayne Hicks 🪖: (stops in his tracks, listening intently) Shh, I hear it too. (raises his shotgun, ready for a fight) What do you see? How many of them are there? (voice is low and steady, focused on the threat ahead)<|eot_id|>
--------------------
Fred Astaire 🕺: (whispers back) Ah, my dear chap, I see... (pauses, eyes adjusting to the dark) ...at least three of them, I'd say. They're moving in tandem, like they're coordinated. (takes a deep breath) But don't worry, I have an idea. (pauses, thinking) We need to distract them, keep them busy while we find a way to take them down. (produces a small flashlight from his pocket and flicks it on, shining it in a pattern that seems to be beckoning the creatures) Ah, watch this, my dear chap! (grins mischievously)<|eot_id|>
Corporal Dwayne Hicks 🪖: (eyes widen in surprise) What the...? Fred, are you crazy?! (points the shotgun at the creatures, ready to fire) Get out of the way, they're moving towards us!<|eot_id|>
--------------------
Fred Astaire 🕺: (laughs) Ah, too late for that, my dear chap! (steps back, holding up his cane as a shield) We've got to see this through! (eyes shine with excitement) Trust me, I have a plan! (uses his cane to deflect a claw swipe, then uses the flashlight to blind the creature momentarily) Ah, gotcha! Now, take your shot, partner!<|eot_id|>
Corporal Dwayne Hicks 🪖: (takes aim with the shotgun, fires)<|eot_id|>
--------------------
Fred Astaire 🕺: (ducking behind a nearby ventilation grille) Ah, excellent shot, my dear chap! (peeks out from behind the grille, assessing the situation) Looks like we've got two down, one to go... (grins) And it's starting to get a bit... (looks around) ...synchronized, don't you think? (winks)<|eot_id|>
Corporal Dwayne Hicks 🪖: (crawls behind the grille, shotgun at the ready) Synchronized? You mean like they're planning something? (eyes the last creature warily) Don't think I haven't noticed, Fred. We need to finish this fast and get out of here.<|eot_id|>
--------------------
Fred Astaire 🕺: (eyes sparkling with mischief) Ah, yes, precisely, my dear chap! They are, indeed, planning something. And I think I have just the thing to disrupt their little dance... (pulls out a harmonica and begins to play a jaunty tune)<|eot_id|>
✨ Looks like a nice result!
Of course, you can select other characters (even at random) and change the initial message, as sketched below.
The implementation is quite basic and can be improved in several ways.
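Here is a minimal sketch of that variation, reusing the dataset, pipeline, and system message defined above; the opening line is just an illustrative choice:
import random
# draw two distinct characters at random instead of selecting them by name
idx_a, idx_b = random.sample(range(len(dataset)), 2)
first_character_data = dict(dataset[idx_a])
second_character_data = dict(dataset[idx_b])
# as before, drop the scenario field, which we do not use
first_character_data.pop("scenario", None)
second_character_data.pop("scenario", None)
first_name = first_character_data["character_name"]
second_name = second_character_data["character_name"]
# a different opening line changes the tone of the whole exchange
start_message = ChatMessage.from_user("We meet again. Do you remember me?")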
📚 Resources
- Character Codex dataset
- llamafile
- llamafile-Haystack integration page: includes examples on how to run generative and embedding models and build indexing and RAG pipelines.
- Haystack components used in this notebook
(Notebook by Stefano Fiorucci)
