[PROTOCOL:human-AI|v=2.0|compress=40-65%|think=compressed|workflow=chain]=>[OUT]
Like HTTP standardized the web, I-Lang standardizes how humans talk to AI. One protocol, every platform, no vendor lock-in. Open source, free forever.
ilang.ai · ilang.cn · research.ilang.ai · i.ilang.ai
| Product | What it does | Install |
|---|---|---|
| AutoCode | 39 auto-activated skills for Claude Code. Say what you want, get a product. | Ask your AI: "Install the autocode plugin from github.com/ilang-ai/autocode" |
| ZeroCode | 40 Chinese skills for Trae / VS Code. Zero code, zero config, zero English. | Download ZIP or VS Code Marketplace |
| AI See | Give your AI eyes. Prefix any URL with i.ilang.ai/ (i.ilang.ai/https://any-url), then paste the result into any AI conversation. | No install. Just paste the URL. |
| OpenClaw Skills | Token compression, AI-to-AI prompting, universal upgrade. ClawHub verified. | Via ClawHub |
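The AI See row above is plain URL prefixing, so it can be scripted; a minimal sketch (the i.ilang.ai prefix comes from the table, the function name is hypothetical):

```python
def ai_see(url: str) -> str:
    """Build an AI See link by prefixing the target URL with i.ilang.ai/."""
    return f"https://i.ilang.ai/{url}"

print(ai_see("https://example.com/article"))
# https://i.ilang.ai/https://example.com/article
```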
Three layers. Copy, paste into any AI, done.
- **Transmission**: compresses human-AI communication (40-65% token savings)
- **Thinking**: the AI compresses its own internal reasoning (invisible, always active)
- **Workflow**: multi-step tasks run as single executable chains
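The workflow layer expresses multi-step tasks in the `[VERB|key=val]=>…` chain syntax used throughout this page. A minimal sketch of assembling such a chain (the helper names are hypothetical, not part of the protocol; the syntax follows the examples in this document):

```python
def step(verb: str, **params: str) -> str:
    """Render one I-Lang step, e.g. [SUM|sty=bullets,fmt=md]."""
    if params:
        kv = ",".join(f"{k}={v}" for k, v in params.items())
        return f"[{verb}|{kv}]"
    return f"[{verb}]"

def chain(*steps: str) -> str:
    """Join steps into a single executable workflow chain."""
    return "=>".join(steps)

print(chain(step("READ:@FILE"),
            step("FILT", key="important"),
            step("SUM", sty="bullets", ton="pro", fmt="md"),
            step("OUT")))
# [READ:@FILE]=>[FILT|key=important]=>[SUM|sty=bullets,ton=pro,fmt=md]=>[OUT]
```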
30 seconds to start: Copy the protocol from ilang.ai, paste into any AI. No SDK, no API key, no installation.
ChatGPT · Claude · Gemini · DeepSeek · Kimi · 豆包 (Doubao) · 元宝 (Yuanbao). All tested, all verified.
I-Lang Research explores how structured communication protocols reduce AI hallucination in high-stakes domains including clinical AI, psychiatric assessment, and medical diagnostics.
Published papers:
| Paper | Status | Links |
|---|---|---|
| The Inductive Dilemma of AI Hallucination | Published | ResearchGate · SSRN · ChinaXiv |
| Selective Forgetting Algorithm | In progress | |
| AI-Era Cryptography (Honesty Paradox) | In progress | |
| Cross-Base Genetic Expression of AI Personality | Planned | |
ORCID: 0009-0004-4540-8082
Research site: research.ilang.ai
| Repo | Description |
|---|---|
| autocode | 39 AI coding skills for Claude Code / Codex / OpenCode |
| trae | ZeroCode: 40 Chinese AI coding skills for Trae / VS Code |
| ilang.ai | Protocol website (international) |
| ilang-openclaw | OpenClaw skill packages |
| ilang-dict | Public dictionary: 52 verbs, 28 modifiers, 14 entities |
| ilang-research | Academic papers and protocol specification |
| ilang-spec | Protocol specification |
Before (67 words):
Please read the document I uploaded, extract all the key points and important data, then organize them into a professional summary with bullet points in Markdown format...
After (1 line):
[READ:@FILE]=>[FILT|key=important]=>[SUM|sty=bullets,ton=pro,fmt=md]=>[OUT]
Before (13 words):
Please translate this text into English and then format the output as Markdown.
After (1 line):
[TRANSLATE|lang=en]=>[FMT|fmt=md]=>[OUT]
40-65% fewer tokens. Same result. Works on every AI.
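The savings figure can be sanity-checked with a crude character-count proxy, using the translate example above (real tokenizer counts will differ; this sketch is illustrative only):

```python
def char_savings(before: str, after: str) -> float:
    """Rough compression ratio using character counts as a token proxy."""
    return 1 - len(after) / len(before)

before = ("Please translate this text into English and then "
          "format the output as Markdown.")
after = "[TRANSLATE|lang=en]=>[FMT|fmt=md]=>[OUT]"
print(f"{char_savings(before, after):.0%} fewer characters")
# 49% fewer characters
```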
Sites and products built with I-Lang protocol-layer thinking:
| Product | What it does |
|---|---|
| ffp.news | AI-powered frequent flyer intelligence |
| hotelcorporate.codes | Hotel corporate code database with AI editorial |
I-Lang Research · Eastsoft Inc. · Canada · 2026 · MIT