
Webspy

Author: Spiderssh
Tool: Webspy – Hybrid Endpoint Analyzer for Web Recon


Overview

Webspy is a comprehensive Python tool designed to extract backend endpoints from HTML/JS files or live websites. It combines static analysis with optional asynchronous crawling to uncover REST, GraphQL, WebSockets, redirects, and generic endpoints. It also supports recursive decoding of Base64 and URL-encoded endpoints and can optionally deliver results to Telegram.

Key Features:

  • Analyze local files (HTML, JS) or directories
  • Optional crawling of live websites with --depth
  • Inline JS and external JS detection
  • Recursive Base64 & URL decoding
  • Endpoint categorization: REST, GraphQL, WebSocket, redirects, generic
  • JSON and TXT output
  • Optional Telegram delivery via .env
  • Automatic ffuf and nuclei command generation
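
The recursive Base64 & URL decoding feature can be illustrated with a short sketch. This is not Webspy's actual implementation; the helper name and the round limit are illustrative only:

```python
import base64
import binascii
from urllib.parse import unquote

def decode_recursively(value: str, max_rounds: int = 5) -> str:
    """Repeatedly URL-decode and Base64-decode a value until it stabilizes."""
    for _ in range(max_rounds):
        previous = value
        # URL-decode first: %2F -> /, %3A -> :, etc.
        value = unquote(value)
        # Then attempt strict Base64; skip if the payload is not valid Base64/UTF-8.
        try:
            decoded = base64.b64decode(value, validate=True).decode("utf-8")
            if decoded.isprintable():
                value = decoded
        except (binascii.Error, UnicodeDecodeError, ValueError):
            pass
        if value == previous:
            break
    return value
```

Because plain paths can coincidentally look like valid Base64, a broad decoder like this will occasionally produce false positives, which is why the README flags that caveat below.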

Installation & Dependencies

1. Clone the repository

git clone https://github.com/Spiderssh/Webspy.git
cd Webspy

2. Ensure Python 3.10+ is installed

Check your Python version:

python3 --version

3. Install Dependencies

pip install aiohttp beautifulsoup4 python-dotenv requests

Dependencies Explained:

aiohttp – Asynchronous HTTP requests for crawling and fetching

beautifulsoup4 – HTML parsing for extracting JS and links

python-dotenv – Read .env configuration (optional, for Telegram)

requests – Send outputs to Telegram (optional)


Usage

Analyze Local Files or Directories

python3 webspy.py /path/to/files -o endpoints

Crawl a Website Before Analysis

python3 webspy.py https://example.com --depth 2 -o endpoints

Send Results to Telegram

  1. Create a .env file with your bot credentials:

TELEGRAM_BOT_TOKEN=123456:ABCDEF...
TELEGRAM_CHAT_ID=123456789

  2. Run with --send:

python3 webspy.py /path/to/files --send --env .env

CLI Arguments

Argument        Description

inputs          Files, directories, or a single URL to crawl
--base-url      Base URL for normalization of absolute URLs
-o, --output    Output base name (JSON and TXT will use this)
--depth         Max crawl depth (default: 1)
--send          Send outputs to Telegram
--env           Path to .env file for Telegram credentials
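
A parser mirroring the CLI arguments above could look like the following argparse sketch. Defaults not stated in this README (such as the output base name) are assumptions:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    """Argparse setup mirroring Webspy's documented CLI (a sketch, not the real parser)."""
    p = argparse.ArgumentParser(prog="webspy.py")
    p.add_argument("inputs", nargs="+",
                   help="Files, directories, or a single URL to crawl")
    p.add_argument("--base-url",
                   help="Base URL for normalization of absolute URLs")
    # Default output name "endpoints" is assumed from the usage examples.
    p.add_argument("-o", "--output", default="endpoints",
                   help="Output base name (JSON and TXT will use this)")
    p.add_argument("--depth", type=int, default=1, help="Max crawl depth")
    p.add_argument("--send", action="store_true",
                   help="Send outputs to Telegram")
    p.add_argument("--env", default=".env",
                   help="Path to .env file for Telegram credentials")
    return p
```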


Output Structure

JSON Example:

{
  "endpoints": {
    "rest": ["/api/v1/users", "/v2/login"],
    "graphql": ["/graphql"],
    "websocket": ["/ws/updates"],
    "redirects": ["/return_url"],
    "generic": ["/about", "/contact"]
  },
  "all": ["/api/v1/users", "/v2/login", "/graphql", "/ws/updates", "/return_url", "/about", "/contact"],
  "references": {
    "external_js": ["https://cdn.example.com/lib.js"],
    "external_urls": ["https://other.example.com"]
  }
}

TXT Example:

/api/v1/users
/v2/login
/graphql
/ws/updates
/return_url
/about
/contact
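
A categorization pass producing buckets like the JSON above can be sketched with a few regexes. The patterns here are assumptions for illustration; Webspy's actual classification rules may differ:

```python
import re

# Hypothetical category patterns -- not Webspy's real rules.
CATEGORY_PATTERNS = [
    ("rest", re.compile(r"/(api/)?v\d+/|^/api/")),
    ("graphql", re.compile(r"graphql", re.IGNORECASE)),
    ("websocket", re.compile(r"^/ws/|^wss?://")),
    ("redirects", re.compile(r"redirect|return_url|next=", re.IGNORECASE)),
]

def categorize(endpoints: list[str]) -> dict[str, list[str]]:
    """Bucket endpoints into the categories used in the JSON output."""
    buckets: dict[str, list[str]] = {
        "rest": [], "graphql": [], "websocket": [], "redirects": [], "generic": []
    }
    for ep in endpoints:
        for name, pattern in CATEGORY_PATTERNS:
            if pattern.search(ep):
                buckets[name].append(ep)
                break
        else:
            # No pattern matched: fall through to the generic bucket.
            buckets["generic"].append(ep)
    return buckets
```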

Crawling Details

  • Crawled pages are saved in ./crawled by default.
  • Only HTML and JS files are analyzed.
  • Use --depth to control the crawl recursion depth.
  • Base64 decoding is intentionally broad and may produce false positives.
  • During normalization, absolute URLs from other hosts are ignored.
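
A depth-limited, same-host crawl like the one described above can be sketched with the tool's stated dependencies (aiohttp and beautifulsoup4). This is an illustrative sketch, not Webspy's actual crawler:

```python
import asyncio
from urllib.parse import urljoin, urlparse

import aiohttp
from bs4 import BeautifulSoup

def same_host_links(html: str, base_url: str) -> set[str]:
    """Resolve <a href> values against base_url, keeping only same-host links."""
    host = urlparse(base_url).netloc
    links = set()
    for a in BeautifulSoup(html, "html.parser").find_all("a", href=True):
        resolved = urljoin(base_url, a["href"])
        # Absolute URLs pointing at other hosts are ignored, matching the note above.
        if urlparse(resolved).netloc == host:
            links.add(resolved)
    return links

async def crawl(start_url: str, depth: int = 1) -> dict[str, str]:
    """Fetch pages breadth-first, up to `depth` link-hops from start_url."""
    pages: dict[str, str] = {}
    frontier = {start_url}
    async with aiohttp.ClientSession() as session:
        for _ in range(depth):
            next_frontier: set[str] = set()
            for url in frontier - pages.keys():
                try:
                    async with session.get(url) as resp:
                        pages[url] = await resp.text()
                except aiohttp.ClientError:
                    continue
                next_frontier |= same_host_links(pages[url], url)
            frontier = next_frontier
    return pages
```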


Telegram Integration

Requires .env file with:

TELEGRAM_BOT_TOKEN=123456:ABCDEF...
TELEGRAM_CHAT_ID=123456789

Sends JSON and TXT results to the specified Telegram chat/bot.

Dependencies: python-dotenv and requests


Examples

Analyze local files:

python3 webspy.py ./site_files -o endpoints

Crawl a website:

python3 webspy.py https://example.com --depth 2 -o endpoints

Send to Telegram:

python3 webspy.py ./site_files --send --env .env

Generate ffuf/nuclei commands automatically:

ffuf -u https://TARGET/FUZZ -w endpoints.txt -t 50
nuclei -l endpoints.txt -t ~/nuclei-templates/
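
Generating those one-liners from a target and the TXT wordlist is straightforward; a minimal sketch (the function name is hypothetical):

```python
def recon_commands(target: str, wordlist: str = "endpoints.txt") -> list[str]:
    """Produce the ffuf/nuclei commands shown above for a given target host."""
    return [
        f"ffuf -u https://{target}/FUZZ -w {wordlist} -t 50",
        f"nuclei -l {wordlist} -t ~/nuclei-templates/",
    ]
```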

Usage Notice

Webspy is intended for authorized red team engagements, penetration testing, and OSINT only. Unauthorized use against systems you do not have permission to test may be illegal. Use responsibly.

