karpathy: Analyzing historical discussions with LLMs for prescience
Top 58.6% on SourcePulse
A Hacker News time capsule project that pulls the HN frontpage from exactly 10 years ago, analyzes articles and discussions using an LLM to evaluate prescience with the benefit of hindsight, and generates an HTML report. It aims to identify prescient commenters and explore LLMs' ability to synthesize historical knowledge, serving researchers and enthusiasts interested in long-term trend analysis and online community foresight.
How It Works
The project fetches the Hacker News frontpage from ten years prior, retrieves article content and comments, and then employs a Large Language Model (LLM) to analyze outcomes with hindsight. The LLM grades commenters based on how their statements aged, aggregating these into a "Hall of Fame" to track prediction track records. This approach leverages LLMs to automatically scour historical human discussions and synthesize insights about foresight and prediction.
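The repository's code isn't reproduced on this page, but the loop described above can be sketched roughly as follows. This is a minimal illustration, assuming the public Algolia HN Search API as one way to locate decade-old front-page stories and a simple hindsight-grading prompt; the helper names and prompt wording are hypothetical, and the actual pipeline.py may fetch, stage, and prompt quite differently.

```python
"""Illustrative sketch only; not the repository's actual code.

Assumes the public Algolia HN Search API for finding front-page stories
from ten years ago. The prompt wording and helper names are hypothetical.
"""
from datetime import datetime, timezone

import requests


def frontpage_ten_years_ago(today: datetime | None = None) -> list[dict]:
    """Return HN stories that hit the front page on this calendar date ten years ago."""
    today = today or datetime.now(timezone.utc)
    try:
        target = today.replace(year=today.year - 10)
    except ValueError:  # today is Feb 29 and the year a decade back is not a leap year
        target = today.replace(year=today.year - 10, day=28)
    start = int(target.replace(hour=0, minute=0, second=0, microsecond=0).timestamp())
    end = start + 24 * 60 * 60
    resp = requests.get(
        "https://hn.algolia.com/api/v1/search",
        params={
            "tags": "front_page",
            "numericFilters": f"created_at_i>={start},created_at_i<{end}",
            "hitsPerPage": 30,
        },
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["hits"]


def build_hindsight_prompt(story: dict, comments: list[str]) -> str:
    """Assemble a hindsight-grading prompt for the LLM (wording is illustrative)."""
    joined = "\n---\n".join(comments)
    return (
        f"This Hacker News story is from {story.get('created_at', 'ten years ago')}:\n"
        f"Title: {story.get('title')}\nURL: {story.get('url')}\n\n"
        f"Comments:\n{joined}\n\n"
        "With the benefit of hindsight, grade each commenter on how well their "
        "claims and predictions aged, and briefly justify each grade."
    )


if __name__ == "__main__":
    # Print a few of the decade-old front-page stories as a smoke test.
    for story in frontpage_ten_years_ago()[:3]:
        print(story["title"], "-", story.get("url"))
```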
Quick Start & Requirements
- Install dependencies with uv sync.
- Set your OpenAI API key (in a .env file as OPENAI_API_KEY=your-key-here).
- Run uv run python pipeline.py [stage_name] (e.g., all, fetch, analyze). Options include --limit for testing and --date for analyzing a specific historical date.
- LLM API costs apply during the analyze stage.
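Putting those commands together, a typical session might look like the following; the flag values and the --date format shown here are illustrative assumptions, not documented behavior.

```bash
uv sync                                          # install dependencies
echo "OPENAI_API_KEY=your-key-here" > .env       # API key (overwrites any existing .env)
uv run python pipeline.py all                    # run every stage end to end
uv run python pipeline.py all --limit 5          # smaller test run (example value)
uv run python pipeline.py all --date 2015-11-24  # assumed YYYY-MM-DD date format
```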
Maintenance & Community
The author explicitly states that "99% of this repo was vibe coded in a few hours with Opus 4.5. Code is provided as is and I don't intend to support it." This indicates a lack of ongoing maintenance or community support.
Licensing & Compatibility
Limitations & Caveats
The project is provided "as is" with no intention of support, reflecting its "vibe coded" nature. Significant costs may be incurred due to extensive LLM API calls during the analysis phase. The accuracy and quality of the analysis are dependent on the LLM's capabilities and the quality of the historical data.
Last updated: 1 month ago · Inactive