OpenP5 by agiresearch

Open-source platform for LLM-based recommender systems research

created 2 years ago
308 stars

Top 88.4% on SourcePulse

Project Summary

OpenP5 is an open-source platform designed for the development, fine-tuning, and evaluation of Large Language Model (LLM)-based recommender systems. It caters to researchers and practitioners in the field of recommender systems, offering a unified framework to experiment with LLM backbones for recommendation tasks.

How It Works

OpenP5 uses LLMs, specifically T5 and LLaMA-2, as backbone models for recommendation. Recommendation tasks are cast as text generation: a user's interaction history and a task instruction are serialized into a prompt, and the model generates the ID of the recommended item. The platform supports several item ID indexing methods, handles both sequential and straightforward recommendation tasks, and provides a structured pipeline for data preparation, model training, and evaluation.
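As a rough illustration of this prompt-based paradigm (the checkpoint name and prompt wording below are placeholders, not OpenP5's actual template), a fine-tuned T5 backbone can be asked to generate the next item ID from a serialized interaction history:

```python
# Sketch of prompt-based sequential recommendation with a T5 backbone
# (illustrative only; assumes the Hugging Face transformers library, and the
# stock "t5-small" checkpoint is a stand-in for a fine-tuned one).
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# The user's interaction history is serialized into a text prompt; the model
# is expected to generate the indexed ID of the next item.
prompt = (
    "user_123 has interacted with items 15, 42, 7 in order. "
    "Predict the next item the user will interact with."
)
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=8)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```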

Quick Start & Requirements

  • Installation: Download the data from Google Drive, place it in ./data, and run sh generate_dataset.sh.
  • Prerequisites: Environment requirements are listed in ./src/src_t5/environment_t5.txt and ./src/src_llama/environment_llama.txt, including the required Python version and libraries.
  • Usage: Training commands are in ./command (e.g., sh ML1M_t5_sequential.sh). Evaluation commands are in ./test_command. Checkpoints are available via Google Drive link.
  • Documentation: Paper available at https://arxiv.org/pdf/2203.13366.pdf.

Highlighted Details

  • Supports T5 and LLaMA-2 backbone LLMs.
  • Includes 10 datasets and 3 item ID indexing methods (see the indexing sketch after this list).
  • Handles sequential and straightforward recommendation tasks.
  • Version 2.0 released May 2024.
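
To make item ID indexing concrete, the following is a minimal sketch of one possible scheme, in which items receive short numeric IDs in order of first appearance; it is illustrative only and is not OpenP5's implementation of any of its indexing methods.

```python
# Minimal sketch of item ID indexing (illustrative only, not OpenP5's code).
# The goal: map raw item identifiers to short token strings that an LLM can
# read in prompts and emit during generation.

def index_items(interactions):
    """Assign consecutive integer IDs to items in order of first appearance.

    `interactions` is a list of (user, item) pairs; returns {raw_id: indexed_id}.
    """
    mapping = {}
    for _, item in interactions:
        if item not in mapping:
            mapping[item] = str(len(mapping) + 1)
    return mapping

interactions = [("u1", "B00X4WHP5E"), ("u1", "B01N5IB20Q"), ("u2", "B00X4WHP5E")]
print(index_items(interactions))  # {'B00X4WHP5E': '1', 'B01N5IB20Q': '2'}
```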

Maintenance & Community

The project has seen recent releases and updates, indicating active development. The primary contributors are Shuyuan Xu, Wenyue Hua, and Yongfeng Zhang.

Licensing & Compatibility

The README does not explicitly state the license. Compatibility for commercial use or closed-source linking is not specified.

Limitations & Caveats

The project is actively being refactored to unify T5 and LLaMA backbone implementations into a single codebase structure. Specific hardware requirements (e.g., GPU, CUDA versions) are not detailed in the README but are expected to be in the environment files.

Health Check

  • Last commit: 5 months ago
  • Responsiveness: 1 week
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 13 stars in the last 90 days
