
Commit ceda7f2

Add a canary model (streaming-capable hello-world)
1 parent 0d9a5de commit ceda7f2

File tree

- README.md
- canary/README.md
- canary/cog.yaml
- canary/predict.py

4 files changed: +33 -0 lines changed

README.md

Lines changed: 1 addition & 0 deletions

@@ -7,6 +7,7 @@ Once you've got a working model and want to publish it so others can see it in a
 ## Examples in this repo
 
 - [blur](blur)
+- [canary](canary)
 - [hello-world](hello-world)
 - [notebook](notebook)
 - [resnet](resnet)

canary/README.md

Lines changed: 21 additions & 0 deletions

@@ -0,0 +1,21 @@
# Canary

This simple model takes a string as input and returns a streaming string output.

## Usage

First, make sure you've got the [latest version of Cog](https://github.com/replicate/cog#install) installed.

Build the container image:

```sh
cog build
```

Now you can run predictions on the model:

```sh
cog predict -i text=Athena

cog predict -i text=Zeus
```
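
Given the predictor added later in this commit (`canary/predict.py`), the first command above should stream the yielded chunks and print a single concatenated string along the lines of `hello there, Athena`; the exact framing of the CLI output may vary between Cog versions.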

canary/cog.yaml

Lines changed: 3 additions & 0 deletions

@@ -0,0 +1,3 @@
build:
  python_version: "3.8"
predict: "predict.py:Predictor"

canary/predict.py

Lines changed: 8 additions & 0 deletions

@@ -0,0 +1,8 @@
from cog import BasePredictor, ConcatenateIterator, Input


class Predictor(BasePredictor):
    def predict(self, text: str = Input(description="Text to prefix with 'hello there, '")) -> ConcatenateIterator[str]:
        yield "hello "
        yield "there, "
        yield text
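
To make the streaming behaviour concrete, here is a minimal sketch (not part of the commit) that drives the predictor directly in Python. It assumes Cog is installed locally, that `predict.py` is importable from the current directory, and that the generator returned by `predict()` can be consumed like any ordinary iterator of string chunks:

```python
# Minimal sketch, not part of the commit: consume the predictor's streamed
# chunks directly. Assumes the `cog` package is installed and predict.py is
# importable from the current directory.
from predict import Predictor

predictor = Predictor()
chunks = list(predictor.predict(text="Athena"))  # ["hello ", "there, ", "Athena"]
print("".join(chunks))                           # hello there, Athena
```

This mirrors what `cog predict -i text=Athena` does end to end: each `yield` is one piece of the streamed output, and the `ConcatenateIterator` return type tells Cog that the pieces should be joined into a single final string.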

0 commit comments