aidapal by atredispartners

IDA Pro plugin for code analysis

Created 1 year ago
337 stars

Top 81.6% on SourcePulse

Project Summary

aiDAPal is an IDA Pro plugin designed to enhance reverse engineering workflows by integrating a locally hosted Large Language Model (LLM) fine-tuned for Hex-Rays pseudocode. It assists analysts by providing context-aware code suggestions and analysis directly within the IDA Pro environment, aiming to accelerate the reverse engineering process.

How It Works

The plugin leverages Ollama's API to interact with a specialized LLM, aidapal-8k.Q4_K_M.gguf, fine-tuned on Hex-Rays pseudocode. This approach allows for local, private LLM execution, avoiding data exposure to external services. The fine-tuning on pseudocode specifically targets the nuances of decompiled C code, enabling more relevant and accurate assistance for reverse engineers.
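As a rough illustration of this flow, a client can query a locally running Ollama instance over its HTTP API. The sketch below is not the plugin's actual code; the function names and prompt are illustrative, and the assumption that the model returns structured JSON reflects how the fine-tuned model is described, not a verified contract:

```python
import json

OLLAMA_URL = "http://localhost:11434"  # default Ollama endpoint noted in the README
MODEL = "aidapal"                      # model name created via `ollama create`

def build_request(pseudocode: str) -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint.

    stream=False asks Ollama for one complete response rather than a
    token-by-token stream; format="json" requests a JSON-formatted reply.
    """
    return {
        "model": MODEL,
        "prompt": pseudocode,
        "stream": False,
        "format": "json",
    }

def analyze(pseudocode: str) -> dict:
    """POST pseudocode to the local Ollama service.

    Requires `pip install requests` and a running `ollama serve`.
    """
    import requests
    resp = requests.post(f"{OLLAMA_URL}/api/generate",
                         json=build_request(pseudocode), timeout=300)
    resp.raise_for_status()
    # Ollama wraps the model's text in a "response" field
    return json.loads(resp.json()["response"])

payload = build_request("int __cdecl sub_401000(int a1) { return a1 * 2; }")
print(payload["model"], payload["stream"])
```

Because everything goes through `localhost`, the decompiled code never leaves the analyst's machine, which is the core privacy argument for this design.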

Quick Start & Requirements

  • LLM Setup: Download aidapal-8k.Q4_K_M.gguf and aidapal.modelfile from Hugging Face. Create the Ollama model with ollama create aidapal -f aidapal.modelfile.
  • Python Requirements: Install the requests library (pip install requests).
  • IDA Pro Plugin: Place the plugin file within IDA Pro's plugins directory.
  • Configuration: Edit the plugin's Python script to change ollama_url or the model list if you are not using the defaults (http://localhost:11434 and ['aidapal']).
  • Resources: Requires Ollama and the GGUF model file (approx. 4.4 GB). Performance is hardware-dependent.
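Taken together, the steps above amount to a short shell session. The download step is left as a placeholder since the exact Hugging Face repository path is not given here, and the plugin filename and IDA path are illustrative:

```shell
# 1. Obtain the model weights and modelfile from Hugging Face
#    (download aidapal-8k.Q4_K_M.gguf and aidapal.modelfile; ~4.4 GB)

# 2. Register the model with a running Ollama install
ollama create aidapal -f aidapal.modelfile

# 3. Install the plugin's only Python dependency
pip install requests

# 4. Copy the plugin into IDA Pro's plugins directory (path varies per install)
# cp aidapal.py "$IDA_INSTALL_DIR/plugins/"
```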

Highlighted Details

  • Integrates LLM-powered code analysis directly into the Hex-Rays decompiler.
  • Provides a context menu for initiating analysis from the Hex-Rays window.
  • Allows users to accept or reject LLM suggestions, updating the decompiled code.
  • Analysis speed in the provided example reflects ARM (Mx) MacBook hardware; results on other machines will vary with available compute.

Maintenance & Community

The project is maintained by Atredis Partners. Further details and insights are available on their blog: https://atredis.com/blog/2024/6/3/how-to-train-your-large-language-model.

Licensing & Compatibility

The repository does not explicitly state a license in the provided README. Compatibility for commercial use or closed-source linking is not specified.

Limitations & Caveats

The plugin relies on a locally running Ollama service and specific model weights, requiring manual setup and configuration. The effectiveness and speed of analysis are directly tied to the user's hardware capabilities.

Health Check

  • Last Commit: 10 months ago
  • Responsiveness: 1 week
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star History: 6 stars in the last 30 days
