Example code that leverages Attested-TLS protocol (GENXT confido lib) to verify and access Confidential LLM API service.

genxnetwork/confidential-ai-example

Confidential AI Example

This repository contains a Dockerized client application for a Confidential AI/LLM service example. The confai-promter.py app leverages the Attested-TLS protocol, implemented in the GENXT confido library. confido is confidential-computing middleware that acts as a specialised TLS certificate handler: using a chain of cryptographically interlinked messages, it verifies the authenticity and integrity of the entire software stack of the remote VM running the inference engine, and it independently verifies the CPU-signed evidence of the VM's memory isolation. Within the Attested-TLS protocol, the CPU implementation is the only explicit trust anchor the user is expected to rely on.

Requirements

This example runs on any platform; it requires only Docker and make to be installed.

Installing Make

On Ubuntu/Debian

You can install make on Ubuntu or any other Debian-based system by running the following command in your terminal:

sudo apt-get install make

On CentOS/RHEL

On CentOS, RHEL, or any other RedHat-based system, you can use the following command:

sudo yum install make

On macOS

If you have Homebrew installed, you can install make by running:

brew install make

If you don't have Homebrew, you can install it by following the instructions on the Homebrew website.

Usage

You can run the following commands using make:

  • make build: This command builds the Docker image with the tag confidential-ai-example. It creates a containerised environment with all the required dependencies, including the confido Attested-TLS library, the httpx standard HTTP client (which supports user-defined trusted TLS certificates), and the open-source OpenAI library for interacting with LLM inference engines.

  • make chat: This command runs the Docker container and executes the confai-promter.py Python script.
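The TLS side of this pattern can be sketched with the Python standard library alone. This is not confido's API (confido derives and verifies the trusted certificate from the attestation evidence itself); the sketch only illustrates how a client restricts trust to a single, explicitly supplied certificate instead of the system CA bundle — the same mechanism httpx exposes through its verify parameter. The certificate file name is hypothetical.

```python
import ssl

# Build a client-side TLS context with secure defaults. PROTOCOL_TLS_CLIENT
# enables hostname checking and mandatory certificate verification.
ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_CLIENT)
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

# Hypothetical path: in real use, ONLY the attested service certificate
# (the one confido checked against the hardware-rooted evidence) would be
# loaded here, replacing the system CA bundle.
# ctx.load_verify_locations(cafile="attested-service.pem")

assert ctx.verify_mode == ssl.CERT_REQUIRED  # PROTOCOL_TLS_CLIENT default
assert ctx.check_hostname is True
```

An httpx client constructed with such a context (httpx.Client(verify=ctx)) would refuse to talk to any endpoint that does not present the pinned, attested certificate.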

The confido library, configured by confai-promter.py, outputs debug messages during the verification of the Attested-TLS Evidence. This evidence includes the Trusted Platform Module (TPM) cryptographic report, as well as GPU and CPU Trusted Execution Environment (TEE) Remote Attestation reports. Notably, all these reports are cryptographically interconnected, and this interconnection is independently verified by the confido client library against CPU and GPU vendors' certificates and well-known software hash-sums.
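The interconnection of the reports can be pictured as a hash chain: each report embeds a digest of its predecessor, so tampering with any link invalidates everything after it. The sketch below is a simplified, hypothetical model of that idea (confido additionally checks vendor signatures and well-known software hashes), using only the hashlib standard library:

```python
import hashlib

def digest(data: bytes) -> bytes:
    """SHA-256 digest (32 bytes) of the given report bytes."""
    return hashlib.sha256(data).digest()

# Hypothetical evidence chain: each report carries the digest of the
# previous one, binding CPU TEE, GPU TEE, and TPM reports together.
cpu_report = b"cpu-tee-report"
gpu_report = digest(cpu_report) + b"gpu-tee-report"
tpm_report = digest(gpu_report) + b"tpm-quote"

def chain_intact(cpu: bytes, gpu: bytes, tpm: bytes) -> bool:
    # Verify each link by recomputing the embedded digests.
    return gpu[:32] == digest(cpu) and tpm[:32] == digest(gpu)

assert chain_intact(cpu_report, gpu_report, tpm_report)
# Tampering with an earlier report breaks every later link:
assert not chain_intact(b"forged-cpu-report", gpu_report, tpm_report)
```

In the real protocol the chain is anchored in the CPU vendor's certificate hierarchy rather than in plain hashes, which is why the CPU implementation is the sole explicit trust anchor.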

Example

For an example of the output generated by running the make chat command, refer to the example-output.md file.
