
Commit

Merge pull request #1 from noobpk/dev
Bump to main
noobpk authored Aug 29, 2023
2 parents 861ce84 + 0730bb9 commit 0a23858
Showing 12 changed files with 288 additions and 32 deletions.
4 changes: 3 additions & 1 deletion .gitignore
@@ -159,4 +159,6 @@ cython_debug/
# option (not recommended) you can uncomment the following to ignore the entire idea folder.
#.idea/
py-env
gemini.keras
gemini.keras
docker-compose.dev.yml
gemini_realtime_predict_req_resp.csv
106 changes: 96 additions & 10 deletions README.md
@@ -1,19 +1,105 @@
# Gemini Predict Serve
A prediction module for detecting web application vulnerabilities, used by gemini-self-protector.

## Web Application Vulnerabilities Detection

This detection method combines a Convolutional Neural Network (CNN) with a Recurrent Neural Network (RNN) to analyze features and relationships in user requests and predict whether they are malicious.
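The CNN–RNN combination described above can be sketched with Keras. This is a minimal illustration only, assuming the 384-dimensional sentence embeddings mentioned later in this README reshaped into a sequence; the actual `gemini.keras` architecture is not published in this repository:

```python
import numpy as np
from tensorflow.keras import layers, models

def build_cnn_rnn(vector_size=384):
    """Toy CNN + RNN binary classifier over an embedding vector."""
    inp = layers.Input(shape=(vector_size, 1))
    x = layers.Conv1D(32, kernel_size=3, activation="relu")(inp)  # local feature extraction
    x = layers.MaxPooling1D(pool_size=2)(x)
    x = layers.LSTM(32)(x)  # sequential relationships across features
    out = layers.Dense(1, activation="sigmoid")(x)  # vulnerability score in [0, 1]
    return models.Model(inp, out)

model = build_cnn_rnn()
score = model.predict(np.zeros((1, 384, 1)), verbose=0)
print(score.shape)  # (1, 1)
```

The sigmoid output matches how `app.py` converts the prediction into a percentage score.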

## Vulnerabilities Detection

- Cross-Site Scripting
- SQL Injection
- Path Traversal (LFI)
- Command Injection
- Remote File Inclusion (RFI)
- JSON & XML Injection
- HTML5 Injection
- Server Side Includes (SSI) Injection
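For illustration, typical payloads in a few of these categories look like the following (hypothetical samples; only the path-traversal string appears in this repository's own examples):

```
<script>alert(1)</script>     # Cross-Site Scripting
' OR 1=1 --                   # SQL Injection
../../../../etc/passwd        # Path Traversal (LFI)
; cat /etc/passwd             # Command Injection
```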

## Get this image
Obtain the latest Gemini Predict Serve image by executing the following command:

```
docker pull noobpk/gemini-predict-serve:latest
```

## Launching through the Command Line
Start the Predict Serve from the command line with Docker, choosing the configuration that matches your needs:

For basic usage without Kafka streaming:

```
docker run --name gemini-predict-serve -p 5000:443 --rm -e AUTH_KEY=your-authen-key gemini-predict-serve
```

If you have an Apache Kafka server and want to enable streaming:

```
docker run --name gemini-predict-serve -p 5000:443 --rm \
-e AUTH_KEY=your-authen-key \
-e ENABLE_KAFKA_STREAMING=True \
-e KAFKA_BOOTSTRAP_SERVER=your-kafka-server \
-e KAFKA_TOPIC=gemini-data-streaming \
-e KAFKA_USERNAME= \
-e KAFKA_PASSWORD= \
-e KAFKA_SECURITY_PROTOCOL=PLAINTEXT \
gemini-predict-serve
```

## Simplified Deployment with Docker Compose

For an even more streamlined deployment process, Docker Compose provides a user-friendly alternative:

### 1. Download the Docker Compose File:

Download the `docker-compose.yml` file from the repository to your local machine or any other system with Docker installed:

```
wget -O docker-compose.yml https://raw.githubusercontent.com/noobpk/gemini-predict-serve/main/docker-compose.yml
```

### 2. Run the Containers:

Navigate to the directory containing the `docker-compose.yml` file using your terminal and execute the following command:

```
docker-compose up
```

## Configuration

Configure the Gemini Predict Serve Docker image using the following environment variables:

- `AUTH_KEY` : Authentication key for the predict API
- `ENABLE_KAFKA_STREAMING` : Enable sending messages to Kafka. Default: False
- `KAFKA_BOOTSTRAP_SERVER` : Kafka bootstrap server. Example: localhost:9092
- `KAFKA_TOPIC` : Kafka topic. Default: gemini-data-streaming
- `KAFKA_USERNAME` : Kafka username
- `KAFKA_PASSWORD` : Kafka password
- `KAFKA_SECURITY_PROTOCOL` : Kafka security protocol. Required
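For local runs, these variables can also be supplied via a `.env` file, which the image loads with `python-dotenv`. A hypothetical example using placeholder values:

```
AUTH_KEY=your-authen-key
ENABLE_KAFKA_STREAMING=False
KAFKA_BOOTSTRAP_SERVER=localhost:9092
KAFKA_TOPIC=gemini-data-streaming
KAFKA_USERNAME=
KAFKA_PASSWORD=
KAFKA_SECURITY_PROTOCOL=PLAINTEXT
```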

## Ping Pong
```
curl --location 'https://127.0.0.1:5000/ping' --insecure \
--header 'Authorization: your-authen-key'
```
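Based on the `/ping` handler in `app.py`, a successful response carries the model metadata, roughly:

```
{
  "encoder": "sentence-transformers/all-MiniLM-L6-v2",
  "vector_size": "384",
  "model_build_at": "2023-08-01",
  "docker_image_version": "1.4",
  "extension": "kafka",
  "author": "noobpk - lethanhphuc"
}
```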

## Predict

```
curl --location 'https://127.0.0.1:5000/predict' --insecure \
--header 'Authorization: your-authen-key' \
--header 'Content-Type: application/json' \
--data '{"data":"../../../../etc/passwd"}'
```
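The same request can be issued from Python. A sketch of assembling the payload (the helper name is an assumption, not part of this project's API):

```python
import json

PREDICT_URL = "https://127.0.0.1:5000/predict"  # adjust to your deployment

def build_predict_request(data, auth_key):
    """Assemble the headers and JSON body the /predict endpoint expects."""
    headers = {"Authorization": auth_key, "Content-Type": "application/json"}
    body = json.dumps({"data": data})
    return headers, body

headers, body = build_predict_request("../../../../etc/passwd", "your-authen-key")

# Sending it requires the `requests` package; verify=False mirrors curl's
# --insecure flag for the self-signed certificate:
#   import requests
#   response = requests.post(PREDICT_URL, headers=headers, data=body, verify=False)
#   print(response.json())
```

Note that when `ENABLE_KAFKA_STREAMING=True`, the handler also reads an `ip` field from the JSON body.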

## Kafka Extensions

### Real-time Predict Plot

![realtime_plot](https://github.com/noobpk/gemini-predict-serve/assets/31820707/f8f4830b-4a8b-4cea-b986-ea843da3782b)

## More About Repository
Github: [gemini-predict-serve](https://github.com/noobpk/gemini-predict-serve)

Image Issues: [Find or create an issue](https://github.com/noobpk/gemini-predict-serve/issues)
2 changes: 1 addition & 1 deletion build-docker-image/.dockerignore
@@ -3,4 +3,4 @@ __pycache__/
.env
docker-compose.yml
Dockerfile
requirements.txt
docker-compose.dev.yml
6 changes: 6 additions & 0 deletions build-docker-image/Dockerfile
@@ -1,6 +1,12 @@
FROM ubuntu:18.04

ENV AUTH_KEY=
ENV ENABLE_KAFKA_STREAMING=
ENV KAFKA_BOOTSTRAP_SERVER=
ENV KAFKA_TOPIC=
ENV KAFKA_USERNAME=
ENV KAFKA_PASSWORD=
ENV KAFKA_SECURITY_PROTOCOL=

# Install Nginx & openssl
RUN apt-get update -q \
10 changes: 5 additions & 5 deletions build-docker-image/README.md
@@ -2,16 +2,16 @@

### Build Image

`docker build -t gemini-web-vuln-detection .`
`docker build -t gemini-predict-serve .`

### Tag and Push Image

`docker tag gemini-web-vuln-detection noobpk/gemini-web-vuln-detection:<version>`
`docker tag gemini-predict-serve noobpk/gemini-predict-serve:<version>`

`docker push noobpk/gemini-web-vuln-detection:<version>`
`docker push noobpk/gemini-predict-serve:<version>`

### Latest Version

`docker tag gemini-web-vuln-detection noobpk/gemini-web-vuln-detection:latest`
`docker tag gemini-predict-serve noobpk/gemini-predict-serve:latest`

`docker push noobpk/gemini-web-vuln-detection:latest`
`docker push noobpk/gemini-predict-serve:latest`
60 changes: 54 additions & 6 deletions build-docker-image/app.py
Expand Up @@ -2,16 +2,25 @@
import os
from flask import Flask, request, jsonify
from flask_cors import CORS
import tensorflow as tf
from tensorflow.keras.models import load_model
from keras.models import load_model
from sentence_transformers import SentenceTransformer
import numpy as np
from tqdm import tqdm
import json
from datetime import datetime
from waitress import serve
from dotenv import load_dotenv
from kafka import KafkaProducer
import ipaddress

load_dotenv()

AUTH_KEY = os.getenv("AUTH_KEY")
ENABLE_KAFKA_STREAMING = os.getenv("ENABLE_KAFKA_STREAMING")
KAFKA_BOOTSTRAP_SERVER = os.getenv("KAFKA_BOOTSTRAP_SERVER")
KAFKA_TOPIC = os.getenv("KAFKA_TOPIC")
KAFKA_USERNAME = os.getenv("KAFKA_USERNAME")
KAFKA_PASSWORD = os.getenv("KAFKA_PASSWORD")
KAFKA_SECURITY_PROTOCOL = os.getenv("KAFKA_SECURITY_PROTOCOL")

# Init Flask app
app = Flask(__name__)
@@ -20,6 +29,23 @@

gemini_model = load_model('gemini.keras')

def validate_ip(ip):
try:
ip_obj = ipaddress.ip_address(ip)
return str(ip_obj) # Return the validated IP address
except ValueError:
return "UNKNOWN" # Return "UNKNOWN" for invalid addresses

def kafka_send_message(key, payload):
try:
producer.send(KAFKA_TOPIC, key=key, value=payload)
producer.flush()
except Exception as e:
return jsonify({
"status": "Exception",
"message": "{}".format(e)
})

@app.route('/ping', methods=['GET'])
def server_info():
authorization_header = request.headers.get('Authorization')
@@ -30,7 +56,8 @@ def server_info():
"vector_size": "384",
"model_build_at": "2023-08-01",
"encoder": "sentence-transformers/all-MiniLM-L6-v2",
"docker_image_version": "1.3",
"docker_image_version": "1.4",
"extension": "kafka",
"author": "noobpk - lethanhphuc"
})
else:
@@ -52,6 +79,17 @@ def predict():
prediction = gemini_model.predict(encode_input)
accuracy = prediction * 100
accuracy_value = float(accuracy[0][0])
if str(ENABLE_KAFKA_STREAMING) == 'True':
input_ip = request.json['ip']
validated_ip = validate_ip(input_ip)
now = datetime.now()
key = b'time_series'
payload = {
'time': now.strftime('%Y-%m-%d %H:%M:%S'),
'ip': validated_ip,
'score': accuracy_value
}
kafka_send_message(key, payload)
return jsonify({
"status": "Success",
"prediction": input_string,
@@ -73,8 +111,18 @@ def predict():
})

if __name__ == '__main__':
print("[+] Service Started")
model_name_or_path = os.environ.get(
for i in tqdm(range(1000), colour="green", desc='Encoder Loading'):
model_name_or_path = os.environ.get(
'model_name_or_path', "sentence-transformers/all-MiniLM-L6-v2")
encoder = SentenceTransformer(model_name_or_path=model_name_or_path)
if str(ENABLE_KAFKA_STREAMING) == 'True':
for i in tqdm(range(100), colour="green", desc='Kafka Loading'):
producer = KafkaProducer(
bootstrap_servers = [KAFKA_BOOTSTRAP_SERVER],
sasl_plain_username = KAFKA_USERNAME,
sasl_plain_password = KAFKA_PASSWORD,
security_protocol = KAFKA_SECURITY_PROTOCOL,
value_serializer = lambda v: json.dumps(v).encode('utf-8')
)
print("[+] Serve Started Successfully")
serve(app, host='0.0.0.0', port=5000)
14 changes: 10 additions & 4 deletions build-docker-image/docker-compose.yml
@@ -1,12 +1,18 @@
version: "3"

services:
gemini-web-vuln-detection:
image: gemini-web-vuln-detection
gemini-predict-serve:
image: gemini-predict-serve
build: .
environment:
- AUTH_KEY=your-authen-key
- AUTH_KEY=
- ENABLE_KAFKA_STREAMING=
- KAFKA_BOOTSTRAP_SERVER=
- KAFKA_TOPIC=gemini-data-streaming
- KAFKA_USERNAME=
- KAFKA_PASSWORD=
- KAFKA_SECURITY_PROTOCOL=PLAINTEXT
ports:
- "5000:443"
container_name: gemini-web-vuln-detection
container_name: gemini-predict-serve
restart: unless-stopped
4 changes: 3 additions & 1 deletion build-docker-image/requirements.txt
@@ -5,4 +5,6 @@ numpy==1.23.5
tensorflow==2.13.0
sentence_transformers==2.2.2
waitress==2.1.2
python-dotenv==1.0.0
python-dotenv==1.0.0
kafka-python==2.0.2
tqdm==4.64.1
14 changes: 10 additions & 4 deletions docker-compose.yml
@@ -1,11 +1,17 @@
version: "3"

services:
gemini-web-vuln-detection:
image: noobpk/gemini-web-vuln-detection
gemini-predict-serve:
image: noobpk/gemini-predict-serve
environment:
- AUTH_KEY="your-authen-key"
- ENABLE_KAFKA_STREAMING=False
- KAFKA_BOOTSTRAP_SERVER=
- KAFKA_TOPIC=gemini-data-streaming
- KAFKA_USERNAME=
- KAFKA_PASSWORD=
- KAFKA_SECURITY_PROTOCOL=PLAINTEXT
ports:
- "3000:443"
container_name: gemini-web-vuln-detection
- "5000:443"
container_name: gemini-predict-serve
restart: unless-stopped
11 changes: 11 additions & 0 deletions kafka-extensions/consumer-basic.py
@@ -0,0 +1,11 @@
from kafka import KafkaConsumer

consumer = KafkaConsumer('gemini-data-streaming',
bootstrap_servers=['localhost:9092'],
auto_offset_reset='earliest',
enable_auto_commit=False)

for message in consumer:
print ("%s:%d:%d: key=%s value=%s" % (message.topic, message.partition,
message.offset, message.key,
message.value))
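The values streamed by the predict server are JSON, so a consumer will usually want to decode them. A sketch building on the basic example above (the function name is an assumption; the broker address matches the basic example):

```python
import json

def deserialize_payload(raw_bytes):
    """Decode the JSON value produced by the predict server."""
    return json.loads(raw_bytes.decode("utf-8"))

if __name__ == "__main__":
    from kafka import KafkaConsumer  # kafka-python

    consumer = KafkaConsumer(
        "gemini-data-streaming",
        bootstrap_servers=["localhost:9092"],
        auto_offset_reset="earliest",
        enable_auto_commit=False,
        value_deserializer=deserialize_payload,
    )
    for message in consumer:
        # message.value is now a dict like {"time": ..., "ip": ..., "score": ...}
        print(message.value["ip"], message.value["score"])
```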
