notebooklm-mcp by PleasePrompto

Grounded AI agent research via NotebookLM

Created 2 months ago
380 stars

Top 75.1% on SourcePulse

Project Summary

This project provides a Model Context Protocol (MCP) server that bridges AI agents like Claude Code, Cursor, and Codex with Google's NotebookLM. It addresses the common problems of AI hallucination, inaccurate retrieval, and excessive token consumption that arise when agents research local documentation, giving developers and researchers grounded, citation-backed answers directly within their workflow. The primary benefit is that AI agents can perform autonomous, deep research on user-provided knowledge bases, leading to more accurate code generation and faster development cycles.

How It Works

The notebooklm-mcp server acts as an intermediary, using browser automation to interact with Google's NotebookLM service. AI agents send queries over MCP; the server forwards them to NotebookLM, which is powered by Gemini 2.5. NotebookLM synthesizes answers exclusively from user-uploaded sources (PDFs, websites, videos, etc.), returning citation-backed responses and refusing to answer when the sources do not cover a question. This approach bypasses complex local RAG setups, offering minimal token costs, rapid setup, and more reliable retrieval than direct file searching or generic web searches.
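
For a concrete picture of the flow, the sketch below uses the official MCP TypeScript SDK to spawn the server over stdio, list its tools, and ask a question. The tool name and argument shape (ask_question, question) are illustrative assumptions, not the server's documented API; the server's README defines the real tool names.

```typescript
import { Client } from "@modelcontextprotocol/sdk/client/index.js";
import { StdioClientTransport } from "@modelcontextprotocol/sdk/client/stdio.js";

// Launch the notebooklm-mcp server as a child process and talk to it over stdio.
const transport = new StdioClientTransport({
  command: "npx",
  args: ["notebooklm-mcp@latest"],
});

const client = new Client({ name: "example-agent", version: "1.0.0" });
await client.connect(transport);

// Discover the tools the server actually exposes.
const { tools } = await client.listTools();
console.log(tools.map((t) => t.name));

// Hypothetical tool call: the real tool names and arguments come from the list above.
const answer = await client.callTool({
  name: "ask_question",
  arguments: { question: "How does the payments API handle idempotency keys?" },
});
console.log(answer.content);

await client.close();
```

In practice an MCP client (Claude Code, Cursor, Codex) performs this handshake for you; the sketch only makes the request/response path explicit.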

Quick Start & Requirements

  • Primary install: npx notebooklm-mcp@latest (see the configuration sketch after this list)
  • Prerequisites: Node.js (for npx), a Google account, and the Chrome browser.
  • Estimated setup time: 5 minutes.
  • Relevant links: NotebookLM (notebooklm.google.com)
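
Most MCP clients register the server with a small JSON entry in their MCP configuration. The snippet below is a sketch of the common mcpServers shape used by clients such as Claude Code and Claude Desktop; the exact file name and location depend on the client, and the project README is the authoritative reference.

```json
{
  "mcpServers": {
    "notebooklm": {
      "command": "npx",
      "args": ["notebooklm-mcp@latest"]
    }
  }
}
```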

Highlighted Details

  • Zero Hallucinations: NotebookLM refuses to answer if information is not present in the uploaded documents.
  • Autonomous Research: AI agents automatically ask sequential follow-up questions to build comprehensive understanding.
  • Smart Library Management: Save NotebookLM links with tags; agents intelligently select relevant notebooks for tasks (see the sketch after this list).
  • Cross-Tool Sharing: Persistent sessions and a shared library work across various MCP clients like Claude Code, Codex, and Cursor.
  • Pre-processed Knowledge Base: Documents are processed by Gemini once, eliminating the need for local embeddings or vector databases.
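
To illustrate the library workflow, the fragment below continues the client from the sketch under "How It Works" and shows how an agent might filter saved notebooks by tag before directing questions at one. Both tool names (list_notebooks, select_notebook) and their argument shapes are assumptions for illustration, not the server's documented API.

```typescript
// Continues the `client` from the earlier sketch. Tool names and arguments
// here are hypothetical; consult the server's tool list for the real ones.
const { content: notebooks } = await client.callTool({
  name: "list_notebooks",
  arguments: { tags: ["payments", "backend"] },
});
console.log(notebooks);

// Point subsequent questions at the chosen notebook (hypothetical tool).
await client.callTool({
  name: "select_notebook",
  arguments: { notebook: "payments-api-docs" },
});
```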

Maintenance & Community

The project is maintained by "PleasePrompto" with contributions welcomed via GitHub issues and pull requests. Specific community channels like Discord or Slack are not detailed in the README.

Licensing & Compatibility

  • License: MIT.
  • Compatibility: The MIT license permits free use, modification, and distribution, including in commercial and closed-source projects.

Limitations & Caveats

Because the project relies on browser automation, Google may detect the automated usage; using a dedicated Google account is recommended. And while the system aims for accuracy, the AI agents driving it (Claude, Codex) can still make mistakes, so users should review changes, test in safe environments, and keep backups. The author disclaims responsibility for potential data loss or account issues.

Health Check

  • Last Commit: 2 weeks ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 1
  • Issues (30d): 7
  • Star History: 160 stars in the last 30 days
