HEBO by huawei-noah

AI research implementations: Bayesian optimization, RL, generative models

created 3 years ago
2,645 stars

Top 18.2% on sourcepulse

Project Summary

This repository provides official implementations for research in Bayesian Optimization, Reinforcement Learning, and Generative Models from Huawei's Noah's Ark Lab. It offers a suite of advanced algorithms and frameworks for tackling complex optimization, decision-making, and generative tasks, benefiting researchers and practitioners in machine learning and artificial intelligence.

How It Works

The library is modular, with distinct sub-directories for Bayesian Optimization (BO), Reinforcement Learning (RL), and Generative Models. BO methods leverage techniques like evolutionary algorithms, neural processes, random decompositions, and deep metric learning to optimize functions efficiently, especially in high-dimensional or combinatorial spaces. RL approaches focus on safe exploration, state augmentation, and model-based learning with pessimism. Generative models explore human-like memory for LLMs and efficient speculative decoding.
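Most of the BO tools in this suite are driven through a suggest/observe (ask-tell) loop: the optimizer proposes candidate points, the caller evaluates the black-box function, and the results are fed back. The sketch below shows that interaction pattern with a deliberately minimal random-search stand-in; the `ToyOptimizer` class and its method names are illustrative of the convention, not any sub-project's actual API.

```python
import random

class ToyOptimizer:
    """Minimal stand-in for a BO optimizer's ask-tell interface.

    A real BO library replaces the random sampling below with a surrogate
    model (e.g. a Gaussian process) and an acquisition function.
    """

    def __init__(self, lb: float, ub: float):
        self.lb, self.ub = lb, ub
        self.history = []  # list of (x, y) observations

    def suggest(self, n_suggestions: int = 1):
        # A real optimizer would maximize an acquisition function here.
        return [random.uniform(self.lb, self.ub) for _ in range(n_suggestions)]

    def observe(self, xs, ys):
        # Feed evaluations back so the surrogate could be refit.
        self.history.extend(zip(xs, ys))

    @property
    def best(self):
        # Best observation so far (minimization).
        return min(self.history, key=lambda xy: xy[1])

def objective(x: float) -> float:
    return (x - 0.37) ** 2  # toy black-box function, minimum at x = 0.37

random.seed(0)
opt = ToyOptimizer(lb=-3.0, ub=3.0)
for _ in range(20):
    xs = opt.suggest(n_suggestions=4)          # ask for a batch
    opt.observe(xs, [objective(x) for x in xs])  # tell the results

best_x, best_y = opt.best
```

Batch suggestions (`n_suggestions > 1`) match how these libraries support parallel evaluation of expensive objectives.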

Quick Start & Requirements

  • Installation and usage instructions are specific to each sub-project and detailed in their respective README files.
  • Dependencies vary by project but generally include Python and common ML libraries. Specific projects may require GPU acceleration, CUDA, or specialized datasets.
  • Links to official documentation, demos, and specific project codebases are provided within the main README.

Highlighted Details

  • HEBO: Winner of the NeurIPS 2020 Black-Box Optimisation Challenge.
  • MCBO: Framework for combinatorial and mixed-variable Bayesian Optimization, benchmarking 47 novel algorithms.
  • NAP: End-to-end meta-Bayesian optimization with Transformer Neural Processes, achieving state-of-the-art regret.
  • RDUCB: High-dimensional BO with random decompositions, offering a plug-and-play solution.
  • AntBO: Efficient in silico antibody design using combinatorial BO, outperforming genetic algorithms.
  • EM-LLM: Enables LLMs to handle practically infinite context lengths by integrating human episodic memory principles.

Maintenance & Community

  • Developed by Huawei Noah's Ark Lab.
  • Specific community channels or active development status are not detailed in the provided README.

Licensing & Compatibility

  • The MCBO library is explicitly licensed under the MIT license.
  • Licenses for the other projects are not specified in the provided text; commercial use would require verifying each project's license individually.

Limitations & Caveats

  • The main README defers to individual project READMEs, so full setup and understanding require navigating multiple sub-directories.
  • Specific hardware or software requirements beyond general Python dependencies are not universally listed, requiring per-project checks.
Health Check

  • Last commit: 2 weeks ago
  • Responsiveness: 1 week
  • Pull Requests (30d): 0
  • Issues (30d): 1
  • Star History: 53 stars in the last 90 days
