Add EvolvedAttention: A transformer-based neural network strategy for Prisoner's Dilemma #1471

Status: Open. Wants to merge 8 commits into base branch dev.
12 changes: 10 additions & 2 deletions axelrod/data/all_classifiers.yml
@@ -1,12 +1,12 @@
$\phi$:
$\\phi$:
inspects_source: false
long_run_time: false
makes_use_of: !!set {}
manipulates_source: false
manipulates_state: false
memory_depth: .inf
stochastic: false
$\pi$:
$\\pi$:
inspects_source: false
long_run_time: false
makes_use_of: !!set {}
@@ -439,6 +439,14 @@ Evolved ANN 5 Noise 05:
manipulates_state: false
memory_depth: .inf
stochastic: false
EvolvedAttention:
inspects_source: false
long_run_time: true
makes_use_of: !!set {}
manipulates_source: false
manipulates_state: false
memory_depth: 200
stochastic: false
Evolved FSM 16:
inspects_source: false
long_run_time: false
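The new classifier entry uses the YAML `!!set` tag for `makes_use_of`. A small sketch of how such an entry parses with PyYAML (assumed installed; the inline document below mirrors the diff rather than reading `all_classifiers.yml` from disk):

```python
import yaml

# Mirror of the EvolvedAttention entry added to all_classifiers.yml.
# The !!set tag makes PyYAML build a Python set (empty here).
doc = """
EvolvedAttention:
  inspects_source: false
  long_run_time: true
  makes_use_of: !!set {}
  manipulates_source: false
  manipulates_state: false
  memory_depth: 200
  stochastic: false
"""

classifiers = yaml.safe_load(doc)
entry = classifiers["EvolvedAttention"]
print(entry["memory_depth"])   # 200
print(entry["makes_use_of"])   # set()
```

Note that, unlike most entries, `memory_depth` here is a finite integer (200) rather than `.inf`, reflecting the bounded context window of the attention model.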
Binary file added axelrod/data/model_attention.pth
11 changes: 11 additions & 0 deletions axelrod/load_data_.py
@@ -2,6 +2,8 @@
import pkgutil
from typing import Callable, Dict, List, Optional, Tuple

import torch


def axl_filename(path: pathlib.Path) -> pathlib.Path:
"""Given a path under Axelrod/, return absolute filepath.
@@ -77,3 +79,12 @@ def load_pso_tables(filename="pso_gambler.csv", directory="data"):
values = list(map(float, row[4:]))
d[(name, int(a), int(b), int(c))] = values
return d


def load_attention_model_weights(
filename="model_attention.pth", directory="axelrod/data"
):
"""Load attention model weights."""
path = str(axl_filename(pathlib.Path(directory) / filename))
weights = torch.load(path, map_location=torch.device("cpu"))
return weights
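The loader maps all tensors onto the CPU, so the bundled weights load even on machines without a GPU. A round-trip sketch of the same `torch.load` call (assumes torch is installed; a temp file stands in for `axelrod/data/model_attention.pth`):

```python
import pathlib
import tempfile

import torch

# Save a tiny state dict, then reload it the way
# load_attention_model_weights does: mapped onto the CPU device.
state = {"w": torch.zeros(2, 3)}
path = pathlib.Path(tempfile.mkdtemp()) / "model_attention.pth"
torch.save(state, path)

weights = torch.load(str(path), map_location=torch.device("cpu"))
print(sorted(weights))             # ['w']
print(tuple(weights["w"].shape))   # (2, 3)
```

Passing `map_location=torch.device("cpu")` avoids a runtime error when weights saved from a CUDA device are loaded on a CPU-only machine.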
2 changes: 2 additions & 0 deletions axelrod/strategies/_strategies.py
@@ -30,6 +30,7 @@
from .ann import ANN, EvolvableANN # pylint: disable=unused-import
from .apavlov import APavlov2006, APavlov2011
from .appeaser import Appeaser
from .attention import EvolvedAttention
from .averagecopier import AverageCopier, NiceAverageCopier
from .axelrod_first import (
FirstByDavis,
@@ -348,6 +349,7 @@
EvolvedHMM5,
EvolvedLookerUp1_1_1,
EvolvedLookerUp2_2_2,
EvolvedAttention,
FirmButFair,
FirstByAnonymous,
FirstByDavis,