Tech Stack
Architecture
CLI entry point -> Action router (post/engage/scrape/apply) -> Human behavior engine (Gaussian delays, sinusoidal scrolling, character-by-character typing with typos) -> Playwright browser with stealth patches.
Most browser automation gets detected within minutes. Ghost Browser doesn't, because it doesn't behave like a bot.
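The router layer above can be sketched as an argparse subcommand dispatcher. This is a hypothetical reconstruction: the real entry point, flag names, and handler signatures in Ghost Browser may differ.

```python
import argparse

ACTIONS = {}  # action name -> handler, filled by the decorator below

def action(name):
    """Register a handler under an action name (post/engage/scrape/apply)."""
    def wrap(fn):
        ACTIONS[name] = fn
        return fn
    return wrap

@action("post")
def post(args):
    # Real handler would drive the browser; here we just echo the routing.
    return f"posting: {args.text!r}"

@action("scrape")
def scrape(args):
    return f"scraping: {args.url}"

def route(argv):
    """Parse CLI args and dispatch to the registered action handler."""
    parser = argparse.ArgumentParser(prog="ghost")
    sub = parser.add_subparsers(dest="cmd", required=True)
    p_post = sub.add_parser("post")
    p_post.add_argument("--text", required=True)
    p_scrape = sub.add_parser("scrape")
    p_scrape.add_argument("--url", required=True)
    args = parser.parse_args(argv)
    return ACTIONS[args.cmd](args)
```

The registry pattern keeps each action self-contained, so adding a new verb is one decorated function plus one subparser.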
The Anti-Detection Engine:
human_behavior.py is the core. Every action uses Gaussian-distributed delays instead of fixed waits. Scrolling follows sinusoidal curves, because humans don't scroll at constant speed. Text input happens character by character, with occasional typos and backspaces. The viewport size randomizes slightly between sessions. Session breaks simulate real browsing fatigue.
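The three timing primitives described above could look something like this. Function names and constants are assumptions for illustration, not the actual human_behavior.py API.

```python
import math
import random

def gaussian_delay(mean=0.8, sigma=0.25, floor=0.05):
    """A wait drawn from a normal distribution, clamped to a minimum."""
    return max(floor, random.gauss(mean, sigma))

def scroll_offsets(total_px, steps=20):
    """Split one scroll into sinusoidal increments: slow start, fast
    middle, slow end, summing to total_px."""
    weights = [math.sin(math.pi * (i + 0.5) / steps) for i in range(steps)]
    scale = total_px / sum(weights)
    return [w * scale for w in weights]

def type_events(text, typo_rate=0.03):
    """Yield (key, delay) pairs, occasionally injecting a typo that is
    then 'noticed' and erased with a backspace."""
    for ch in text:
        if random.random() < typo_rate:
            yield (random.choice("asdfjkl"), gaussian_delay(0.12, 0.04))
            yield ("\b", gaussian_delay(0.2, 0.05))  # correct the typo
        yield (ch, gaussian_delay(0.12, 0.04))
```

In a real session each delay would be passed to `page.wait_for_timeout` (or a plain sleep) between Playwright actions.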
LinkedIn Automation:
The linkedin_engage.py CLI handles: posting (with images via the Quill editor), feed scraping, comment engagement, job applications, and connection requests. Each action chain includes realistic delays between steps, because clicking "Post" immediately after typing 500 words is a detection signal.
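One way to make the pause before "Post" plausible is to scale it to how long a human would take to re-read the draft. This is a sketch under assumed names and constants, not the actual linkedin_engage.py logic.

```python
import random

def dwell_before_submit(text, wpm=220, jitter=0.3):
    """Seconds to pause after composing, scaled to a skim-read of the text.

    A human re-reads before posting; a bot clicks instantly. Assume one
    review pass at roughly `wpm` words per minute, plus random jitter.
    """
    words = max(1, len(text.split()))
    base = words / wpm * 60  # seconds for one skim of the draft
    return base * random.uniform(1 - jitter, 1 + jitter)

def post_flow(text):
    """Record a post action chain with a review pause before the click."""
    return [
        ("type", len(text)),
        ("pause", dwell_before_submit(text)),
        ("click", "Post"),
    ]
```

For a 500-word post this yields a pause on the order of a minute or two, which is closer to real composing behavior than a fixed two-second wait.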
The Scraping Architecture:
universal_scraper.py works on any website and ships with six presets (LinkedIn, Indeed, Google Maps, etc.). It extracts structured data, handles pagination, and exports to JSON/CSV. The key insight: scraping slowly with human patterns is more reliable than scraping fast with proxy rotation.
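A preset can be plain data: selectors for the item container, the fields to pull out of each item, and the pagination control. The selectors and preset shape below are hypothetical, as is the export helper; the real universal_scraper.py format may differ.

```python
import csv
import io
import json

# Hypothetical preset shape: CSS selectors for list items, per-item
# fields, and the "next page" control the scraper clicks between pages.
PRESETS = {
    "linkedin_jobs": {
        "item": "li.jobs-search-results__list-item",
        "fields": {
            "title": "a.job-card-list__title",
            "company": ".job-card-container__company-name",
        },
        "next_page": "button[aria-label='Next']",
    },
}

def export_rows(rows, fmt="json"):
    """Serialize scraped rows (a list of dicts) to JSON or CSV text."""
    if fmt == "json":
        return json.dumps(rows, indent=2)
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=sorted(rows[0]))
    writer.writeheader()
    writer.writerows(rows)
    return buf.getvalue()
```

Keeping presets as data rather than code means a new site is a dictionary entry, not a new scraper class.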
Revenue Applications: LinkedIn automation services run $500-2K/month, web scraping gigs $200-1K per project, and job application bots $50-200 per client. Ghost Browser is open-source on GitHub, but the expertise to deploy and customize it is the real product.
Want to build something like this?
I architect and deploy end-to-end AI systems — from MVP to revenue.
Let's Talk