aiDAPal by atredispartners

IDA Pro plugin for code analysis

Created 1 year ago · 318 stars · Top 86.3% on SourcePulse

Project Summary

aiDAPal is an IDA Pro plugin designed to enhance reverse engineering workflows by integrating a locally hosted Large Language Model (LLM) fine-tuned for Hex-Rays pseudocode. It assists analysts by providing context-aware code suggestions and analysis directly within the IDA Pro environment, aiming to accelerate the reverse engineering process.

How It Works

The plugin leverages Ollama's API to interact with a specialized LLM, aidapal-8k.Q4_K_M.gguf, fine-tuned on Hex-Rays pseudocode. This approach allows for local, private LLM execution, avoiding data exposure to external services. The fine-tuning on pseudocode specifically targets the nuances of decompiled C code, enabling more relevant and accurate assistance for reverse engineers.
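The interaction described above can be sketched as a plain `requests` call against Ollama's `/api/generate` endpoint. This is a minimal illustration, not the plugin's actual code: the endpoint, `model`, `stream`, and `format` fields are standard Ollama API parameters, while the response schema returned by the fine-tuned model is not specified here.

```python
import json
import requests

OLLAMA_URL = "http://localhost:11434"  # default local Ollama endpoint
MODEL = "aidapal"                      # name given to `ollama create`

def build_request(pseudocode: str) -> dict:
    """Build a non-streaming /api/generate payload for one function."""
    return {
        "model": MODEL,
        "prompt": pseudocode,
        "stream": False,   # return one complete reply instead of chunks
        "format": "json",  # ask Ollama to constrain output to valid JSON
    }

def query(pseudocode: str) -> dict:
    """Send decompiled pseudocode to the local model and parse the reply."""
    resp = requests.post(f"{OLLAMA_URL}/api/generate",
                         json=build_request(pseudocode), timeout=300)
    resp.raise_for_status()
    # Ollama wraps the model's generated text in a "response" field
    return json.loads(resp.json()["response"])
```

Because everything runs against `localhost`, no pseudocode ever leaves the analyst's machine, which is the privacy property the plugin relies on.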

Quick Start & Requirements

  • LLM Setup: Download aidapal-8k.Q4_K_M.gguf and aidapal.modelfile from Hugging Face. Create the Ollama model with ollama create aidapal -f aidapal.modelfile.
  • Python Requirements: Install the requests library (pip install requests).
  • IDA Pro Plugin: Place the plugin file within IDA Pro's plugins directory.
  • Configuration: Edit the plugin's Python script to adjust ollama_url (default http://localhost:11434) or the list of available models (default ['aidapal']).
  • Resources: Requires Ollama and the GGUF model file (approx. 4.4 GB). Performance is hardware-dependent.
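The configuration step above amounts to editing two variables near the top of the plugin script. The variable names come from the README; the exact layout in the real plugin may differ, so treat this as a hypothetical sketch.

```python
# Configuration variables to edit in the plugin script (names per the
# README; the surrounding code in the real plugin may differ).
ollama_url = "http://localhost:11434"  # default Ollama API endpoint
models = ["aidapal"]                   # models offered in the context menu

# To use a remote Ollama host or expose additional fine-tuned models:
# ollama_url = "http://ollama-host:11434"
# models = ["aidapal", "another-model"]
```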

Highlighted Details

  • Integrates LLM-powered code analysis directly into the Hex-Rays decompiler.
  • Provides a context menu for initiating analysis from the Hex-Rays window.
  • Allows users to accept or reject LLM suggestions, updating the decompiled code.
  • Performance shown in the provided example is representative of Apple Silicon (Mx) MacBook hardware.

Maintenance & Community

The project is maintained by Atredis Partners. Further details and insights are available on their blog: https://atredis.com/blog/2024/6/3/how-to-train-your-large-language-model.

Licensing & Compatibility

The repository does not explicitly state a license in the provided README. Compatibility for commercial use or closed-source linking is not specified.

Limitations & Caveats

The plugin relies on a locally running Ollama service and specific model weights, requiring manual setup and configuration. The effectiveness and speed of analysis are directly tied to the user's hardware capabilities.
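Since the plugin depends on a locally running Ollama service, a quick reachability check can save debugging time before launching an analysis. This is an assumed helper, not part of the plugin; it relies only on the fact that Ollama's root endpoint answers HTTP requests when the service is up.

```python
import requests

def ollama_available(url: str = "http://localhost:11434") -> bool:
    """Return True if a local Ollama service answers on the given URL."""
    try:
        # A running Ollama instance responds 200 on its root endpoint
        return requests.get(url, timeout=2).status_code == 200
    except requests.RequestException:
        # Connection refused / timed out: the service is not reachable
        return False
```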

Health Check

  • Last commit: 8 months ago
  • Responsiveness: Inactive
  • Pull Requests (30d): 0
  • Issues (30d): 0
  • Star history: 30 stars in the last 90 days
