Home
Sabsterrexx edited this page Apr 16, 2024
The zukiPy module is a Python module designed to interact with the ZukiJourney API for chat and image interactions using various AI models (including free GPT-4, Claude, Mistral, and more). To use this module, as well as gain access to the API, please visit https://discord.gg/zukijourney and claim your API key.
```python
import zukiPy
import asyncio

api_key = ""  # Get your API key @ https://discord.gg/zukijourney
api_key_backup = ""  # Set this to your backup API key (optional, usually for testing with different LLM APIs)
# You can set up your backup API key by calling zukiAI.change_backup_key(api_key_backup)

zukiAI = zukiPy.zukiCall(api_key, "gpt-3.5-turbo")
```

- Parameters:
  - `api_key` (str): The API key for the ZukiJourney API.
  - `api_key_backup` (str, optional): A backup API key; default is an empty string.
  - `model` (str, optional): The AI model to use for chat interactions; default is `"gpt-3.5"`.
  - `systemPrompt` (str, optional): A system prompt to guide the AI's responses; default is an empty string.
  - `temperature` (str, optional): A value to control the randomness of the AI's responses; default is an empty string.
- Attributes:
  - `api_endpoint` (str): The primary API endpoint for chat completions.
  - `api_endpoint_unfiltered` (str): An unfiltered API endpoint for chat completions.
  - `api_endpoint_backup` (str): A backup API endpoint, defaulting to the WebRaft API.
  - `systemPrompt` (str): The system prompt for guiding AI responses.
  - `modelsList` (list): A list of supported AI models.
  - `api_key_backup` (str): The backup API key.
  - `model` (str): The selected AI model for chat interactions.
  - `temperature` (str): The temperature value for controlling AI response randomness.
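The attributes above can be pictured as a plain configuration object. `ZukiConfig` below is a hypothetical mirror of that list, using the names and defaults as documented here; the real zukiPy class may be structured differently:

```python
from dataclasses import dataclass, field

@dataclass
class ZukiConfig:
    """Hypothetical sketch of the attributes listed above (not the real zukiPy class)."""
    api_endpoint: str = ""            # primary chat-completions endpoint
    api_endpoint_unfiltered: str = "" # unfiltered chat-completions endpoint
    api_endpoint_backup: str = ""     # defaults to the WebRaft API in zukiPy
    systemPrompt: str = ""
    modelsList: list = field(default_factory=list)
    api_key_backup: str = ""
    model: str = "gpt-3.5"
    temperature: str = ""

cfg = ZukiConfig()
print(cfg.model)  # → gpt-3.5
```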
Changes the backup API endpoint (`change_backup_endpoint`).
- Parameters:
  - `endpoint` (str): The new backup API endpoint.
Sets the system prompt for guiding AI responses.
- Parameters:
  - `systemprompt` (str): The new system prompt.
Sets the temperature value for controlling AI response randomness.
- Parameters:
  - `newTemp` (float): The new temperature value; must be between 0 and 1.
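The 0–1 range check can be sketched as follows. `set_temperature` is a hypothetical stand-in for illustration, not the actual zukiPy method name:

```python
def set_temperature(new_temp: float) -> float:
    # Reject values outside the documented 0-1 range
    if not 0.0 <= new_temp <= 1.0:
        raise ValueError("temperature must be between 0 and 1")
    return new_temp

print(set_temperature(0.7))  # → 0.7
```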
Sends a chat message using the primary API endpoint (`sendMessage`).
- Parameters:
  - `userName` (str): The name of the user sending the message.
  - `userMessage` (str): The message to be sent.
Sends a chat message using the unfiltered API endpoint.
- Parameters:
  - `userName` (str): The name of the user sending the message.
  - `userMessage` (str): The message to be sent.
Sends a chat message using the backup API endpoint (`sendMessageBackup`).
- Parameters:
  - `userName` (str): The name of the user sending the message.
  - `userMessage` (str): The message to be sent.
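The primary/backup split above suggests a simple fallback pattern: try the primary endpoint and retry against the backup on failure. This is a sketch with hypothetical stand-in coroutines, not the real zukiPy calls:

```python
import asyncio

async def send_with_fallback(primary, backup, user_name, user_message):
    # Try the primary endpoint first; fall back to the backup on any error
    try:
        return await primary(user_name, user_message)
    except Exception:
        return await backup(user_name, user_message)

# Hypothetical stand-ins for sendMessage / sendMessageBackup
async def flaky_primary(user, msg):
    raise RuntimeError("primary endpoint down")

async def backup_ok(user, msg):
    return f"{user}: {msg} (via backup)"

print(asyncio.run(send_with_fallback(flaky_primary, backup_ok, "Launchers", "Hello")))
# → Launchers: Hello (via backup)
```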
```python
import zukiPy
import asyncio

api_key = ""  # Get your API key @ https://discord.gg/zukijourney
api_key_backup = ""  # Set this to your backup API key (optional, usually for testing with different LLM APIs)
# You can set up your backup API key by calling zukiAI.change_backup_key(api_key_backup)

zukiAI = zukiPy.zukiCall(api_key, "gpt-3.5-turbo")

async def main():
    chat_response = await zukiAI.zuki_chat.sendMessage("Launchers", "Hello")
    print("Chat Response:", chat_response)

    # I'm sorry Jon...
    image_response = await zukiAI.zuki_image.generateImage("Horror version of Garfield the Cat", 1)
    print("\nImage Response:", image_response)

# To call from a backup API endpoint, use .sendMessageBackup() and .generateImageBackup()
# You also need to run the async function:
asyncio.run(main())
```
Notes:
- Hey, 1_aunchers (creator of this) here. This was made by heavily referencing Sabsterrexx's work. You can find him @ https://github.com/Sabsterrexx
- WARNING: The default backup endpoint is currently WebRaft, which is currently UNSTABLE. Change your backup endpoint using `.change_backup_endpoint()` on `.zuki_chat` and `.zuki_image`.
- If you want to use a certain LLM model that isn't supported, you can use `zukiAI.zuki_chat.modelsList.append("model-name")`.
- Ensure you have the necessary permissions and API keys to use the ZukiJourney API.
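The `modelsList.append` tip amounts to ordinary Python list handling. A small sketch, using an illustrative subset of model names rather than the real `modelsList`:

```python
models_list = ["gpt-3.5-turbo", "gpt-4"]  # illustrative subset, not zukiPy's real list

def add_model(models: list, name: str) -> list:
    # Append a model name only if it isn't already in the supported list
    if name not in models:
        models.append(name)
    return models

add_model(models_list, "my-custom-model")
print(models_list)  # → ['gpt-3.5-turbo', 'gpt-4', 'my-custom-model']
```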
Created by @Sabsterrexx and @1_aunchers