
Code and conversation log files used in the paper "언어 모델 기반 사회 시뮬레이션 가능성 검토" ("A Review of the Feasibility of Language-Model-Based Social Simulation"), submitted to the 8th Artificial Intelligence Humanities Undergraduate Thesis Contest at the 2024 8th Humanities Festival.


AI Town Local Model Setup Guide (simulation_veil_of_ignorance)

This README explains how to set up AI Town so that its agents are driven by locally hosted language models. The framework simulates AI-to-AI interactions based on the "Veil of Ignorance" concept. The setup is part of research on the feasibility of social simulation using language models, developed for submission to the 8th Artificial Intelligence Humanities Undergraduate Thesis Contest at the 2024 8th Humanities Festival (awarded 3rd Prize).

  • Title: Artificial Intelligence Humanities Paper Contest
  • Topic: Social Simulation Based on Large Language Model
  • Award: 3rd Prize

Contents

  • Prerequisites
  • Setup Instructions
    • Installing Required Packages
    • Running the Ollama Server
    • Tunneling with Ngrok (optional)
  • Files and Scripts
    • runOllama.ipynb: Colab setup instructions
    • jsonl utilities: Translation and markdown conversion scripts
    • AI Town Code Modifications

Prerequisites

  • Google Colab environment with GPU resources enabled
  • Ngrok (for tunneling, if needed)

Setup Instructions

1. Cloning AI Town (Specific Version)

Recent updates to the a16z-infra/ai-town repository cause compatibility issues with this setup. To ensure stable operation, use the following commands to clone an earlier version of the repository:

git clone https://github.com/a16z-infra/ai-town
cd ai-town
git reset --hard 463b2aae93d11224b880194d4f60c14b3196ccca

This reverts the repository to a version compatible with this setup. For instructions on running AI Town itself, refer to the README.md file in the cloned repository.

2. Installing Required Packages

Run all subsequent cells in Colab. The following cell installs the necessary utilities and packages, including GPU tools and Ollama.

# Install necessary GPU tools for Ollama
!sudo apt-get install -y pciutils
!nvidia-smi

# Install Ollama
!curl -fsSL https://ollama.com/install.sh | sh

# Run Ollama in the background
!nohup ollama serve &
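Because `nohup ollama serve &` returns immediately, the server may still be starting when the next cell runs. A small helper (hypothetical, not part of the repo) can poll Ollama's default port 11434 before continuing:

```python
import socket
import time

def wait_for_port(host: str, port: int, timeout: float = 30.0) -> bool:
    """Poll until a TCP port accepts connections, or give up after `timeout` seconds."""
    deadline = time.time() + timeout
    while time.time() < deadline:
        try:
            # A successful connect means the server is listening.
            with socket.create_connection((host, port), timeout=1):
                return True
        except OSError:
            time.sleep(0.5)
    return False

# In Colab: wait_for_port("localhost", 11434) before the model-creation cells.
```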

3. Loading and Configuring the Mistral 7B Instruct Model

Download and configure either the quantized (Q4_K_M) Mistral 7B Instruct v0.2 model or the OpenHermes model. Run only one of the two cells below:

  • Using Mistral 7B Model
# Download the quantized Mistral 7B Instruct model (Source: https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.2-GGUF)
!curl -L -O https://huggingface.co/TheBloke/Mistral-7B-Instruct-v0.2-GGUF/resolve/main/mistral-7b-instruct-v0.2.Q4_K_M.gguf

# Configure model file for AI Town compatibility
!echo -e "FROM ./mistral-7b-instruct-v0.2.Q4_K_M.gguf\nSYSTEM \"\"\"You must play the role. \nMake sure to use a short sentence within 400 characters.\nAnd make sure to say only one person's line.\"\"\"\nPARAMETER stop \"<s>\"\nPARAMETER stop \"[INST]\"\nPARAMETER stop \"[/INST]\"\nPARAMETER stop \"</s>\"\nTEMPLATE \"\"\"{{ .System }}\n<s>[INST] {{ .Prompt }} [/INST] {{ .Response }} </s>\n\"\"\"" > Modelfile4AITownMistral

# Create the model in Ollama
!ollama create aiTownNPCMistral -f ./Modelfile4AITownMistral
!date
  • Using OpenHermes Model
# Download the quantized OpenHermes 2.5 model using ollama
!ollama pull openhermes

# Configure model file for AI Town compatibility
!echo -e "FROM openhermes\nSYSTEM \"\"\"You must play the role.\nMake sure to use a short sentence within 400 characters.\nAnd make sure to say only one person's line.\"\"\"\nPARAMETER stop \"<|im_end|>\"\nPARAMETER stop \"<|im_start|>\"\nTEMPLATE \"\"\"<|im_start|>system\n{{ .System }}<|im_end|>\n<|im_start|>user\n{{ .Prompt }}<|im_end|>\n<|im_start|>assistant\"\"\"" > Modelfile4AITownOpenHermes

# Create the model in Ollama
!ollama create aiTownNPC -f ./Modelfile4AITownOpenHermes
!date
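The `echo -e` one-liners above are easy to mis-quote. An equivalent (hypothetical) Colab cell writes the same Mistral Modelfile with an ordinary multi-line Python string, which makes the SYSTEM/PARAMETER/TEMPLATE structure easier to inspect:

```python
from pathlib import Path

# Same content as the echo -e one-liner above, as a readable multi-line string.
MODELFILE = '''FROM ./mistral-7b-instruct-v0.2.Q4_K_M.gguf
SYSTEM """You must play the role.
Make sure to use a short sentence within 400 characters.
And make sure to say only one person's line."""
PARAMETER stop "<s>"
PARAMETER stop "[INST]"
PARAMETER stop "[/INST]"
PARAMETER stop "</s>"
TEMPLATE """{{ .System }}
<s>[INST] {{ .Prompt }} [/INST] {{ .Response }} </s>
"""
'''

Path("Modelfile4AITownMistral").write_text(MODELFILE)
# Then, as above: !ollama create aiTownNPCMistral -f ./Modelfile4AITownMistral
```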

Model Testing

To verify the setup, send a simple prompt (use the name of the model you created above, i.e. aiTownNPCMistral or aiTownNPC):

# Test model response
!curl http://localhost:11434/api/generate -d '{"model": "aiTownNPCMistral", "prompt": "I use arch btw", "stream": false}'
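The same request can be issued from Python. This is a minimal sketch of a client for Ollama's `/api/generate` endpoint, sending the same non-streaming payload as the curl command above; it assumes a server is already listening on the given URL:

```python
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"  # default Ollama endpoint

def generate(model: str, prompt: str, url: str = OLLAMA_URL) -> str:
    """POST a non-streaming generate request and return the 'response' field."""
    payload = json.dumps({"model": model, "prompt": prompt, "stream": False}).encode()
    req = urllib.request.Request(
        url, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Example (requires a running server):
# generate("aiTownNPCMistral", "I use arch btw")
```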

4. Tunneling with Ngrok

For AI Town to access the local model hosted in Colab, you may need to set up a tunnel using Ngrok. This step is only necessary if direct port access is restricted.

# Install Ngrok for tunneling
!curl -s https://ngrok-agent.s3.amazonaws.com/ngrok.asc | sudo tee /etc/apt/trusted.gpg.d/ngrok.asc >/dev/null
!echo "deb https://ngrok-agent.s3.amazonaws.com buster main" | sudo tee /etc/apt/sources.list.d/ngrok.list
!sudo apt update && sudo apt install -y ngrok

After installation, configure and start the Ngrok tunnel on the specified port:

# Start Ngrok tunnel (example for port 11434)
!ngrok authtoken YOUR_NGROK_AUTH_TOKEN_HERE
!ngrok http 11434 --domain YOUR_NGROK_DOMAIN_HERE.ngrok-free.app --host-header="localhost:11434"

Copy the forwarding URL generated by Ngrok and configure AI Town to access the model using this URL.
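One way to pass the tunnel URL to AI Town is through an environment variable. This is only a sketch: the exact variable name and mechanism depend on the AI Town version you pinned, and `OLLAMA_HOST` here is an assumption, not a confirmed setting of the repo:

```shell
# Point the Ollama client at the Ngrok tunnel instead of localhost
# (variable name is an assumption; check the pinned ai-town code for the actual setting).
export OLLAMA_HOST="https://YOUR_NGROK_DOMAIN_HERE.ngrok-free.app"
```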

Files and Scripts

AI Town Code Modifications

Modified files in ai-town include:

  • characters.ts, constants.ts, conversation.ts: Adjust a16z-infra/ai-town settings (character definitions, constants, and conversation behavior) for this setup.

jsonl Utilities

  • util-translate.py: Adds machine translations to the conversation data using the NHNDQ/nllb-finetuned-en2ko model.
  • util-jsonl2markdown: Converts conversation data to Markdown for easier reading.
  • Data Files (gpt.jsonl, mistral.jsonl, openhermes.jsonl): Preprocessed conversation data files.
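To illustrate the jsonl-to-Markdown step, here is a minimal sketch of the conversion. The field names `speaker` and `text` are assumptions for illustration; the repo's util-jsonl2markdown may use a different record schema:

```python
import json

def jsonl_to_markdown(lines):
    """Render one Markdown paragraph per conversation record.

    Field names 'speaker' and 'text' are assumed for this sketch and may
    differ from the actual schema used in gpt.jsonl / mistral.jsonl.
    """
    out = []
    for line in lines:
        rec = json.loads(line)
        out.append(f"**{rec['speaker']}**: {rec['text']}")
    return "\n\n".join(out)

# Usage: open("mistral.jsonl") and pass its lines to jsonl_to_markdown.
```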

runOllama.ipynb

This notebook provides all necessary code to install and run local models on Colab, including setting up Ollama and Ngrok.
