dobrado76 / Stable Diffusion image generation for Unity Editor workflows
Top 100.0% on SourcePulse
Summary
This project offers a Unity Editor component that integrates the Stable Diffusion Automatic 1111 web UI, enabling AI image generation directly within Unity. It streamlines workflows for creating 3D model textures, normal maps, and UI assets, accelerating content creation pipelines for game developers and designers.
How It Works
The integration connects to a Stable Diffusion Automatic 1111 server (local or cloud) via its API. Users configure server settings and generation parameters using StableDiffusionConfiguration components and SDSettings ScriptableObjects. The tool supports text-to-image for 3D model texturing (with optional normal/bump maps) and UI elements, plus image-to-image for UI components.
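For orientation, the sketch below shows the kind of call this integration makes under the hood: a POST to the Automatic 1111 /sdapi/v1/txt2img endpoint, with the returned base64 image decoded and saved into the project. This is a minimal illustration, not the package's actual code; the class, menu, and field names (Txt2ImgSketch, "Tools/Stable Diffusion/Generate Test Texture", the output path) are assumptions, and only a small subset of the endpoint's parameters is shown.

```csharp
// Minimal sketch of a txt2img call from a Unity Editor menu item.
// Assumes the Automatic 1111 web UI is running locally with --api.
using System;
using System.Text;
using UnityEditor;
using UnityEngine;
using UnityEngine.Networking;

public static class Txt2ImgSketch
{
    // Subset of the request body accepted by /sdapi/v1/txt2img.
    [Serializable]
    class Txt2ImgRequest
    {
        public string prompt;
        public int steps = 20;
        public int width = 512;
        public int height = 512;
    }

    // The response carries generated images as base64-encoded strings.
    [Serializable]
    class Txt2ImgResponse
    {
        public string[] images;
    }

    [MenuItem("Tools/Stable Diffusion/Generate Test Texture")]
    static void Generate()
    {
        var payload = new Txt2ImgRequest { prompt = "seamless stone wall texture" };
        string json = JsonUtility.ToJson(payload);

        var req = new UnityWebRequest("http://127.0.0.1:7860/sdapi/v1/txt2img", "POST");
        req.uploadHandler = new UploadHandlerRaw(Encoding.UTF8.GetBytes(json));
        req.downloadHandler = new DownloadHandlerBuffer();
        req.SetRequestHeader("Content-Type", "application/json");

        // Poll via the completion callback so the Editor stays responsive while the server generates.
        req.SendWebRequest().completed += _ =>
        {
            try
            {
                if (!string.IsNullOrEmpty(req.error))
                {
                    Debug.LogError($"txt2img failed: {req.error}");
                    return;
                }

                var response = JsonUtility.FromJson<Txt2ImgResponse>(req.downloadHandler.text);

                // Strip a "data:image/png;base64," prefix if the server includes one.
                string b64 = response.images[0];
                int comma = b64.IndexOf(',');
                if (comma >= 0) b64 = b64.Substring(comma + 1);

                byte[] png = Convert.FromBase64String(b64);
                System.IO.File.WriteAllBytes("Assets/GeneratedTexture.png", png);
                AssetDatabase.Refresh();
                Debug.Log("Saved generated image to Assets/GeneratedTexture.png");
            }
            finally
            {
                req.Dispose();
            }
        };
    }
}
```

In the package itself, these details are wrapped by the StableDiffusionConfiguration component and SDSettings assets, so users configure prompts and server settings in the Inspector rather than writing request code by hand.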
Quick Start & Requirements
- Import Stable-Diffusion-Unity-Integration.unitypackage into your Unity project.
- Launch the Automatic 1111 web UI with the --api argument (e.g., set COMMANDLINE_ARGS= --api in webui-user.bat).
- Open the demo scene (StableDiffusionIntegration/Scenes/DemoScene.unity) and configure the StableDiffusionConfiguration component with the server URL (default: http://127.0.0.1:7860/).
Highlighted Details
Maintenance & Community
Contributions from UnityCoder, ALBRRT, FOXYTOCIN, PeixuanL, and TeoVibe focus on features like normal map generation, image-to-image, authentication, URP materials, and online server support. No explicit community channels are listed.
Licensing & Compatibility
Licensed under LGPL v2.1, which permits free use, modification, and distribution, including for commercial projects, provided derivative works are shared under the same license and the original source is attributed. Generated artwork requires no attribution.
Limitations & Caveats
Material generation components target Built-in and URP; HDRP support is experimental. Tested on Unity 2019-2021 and Unity 6; broader compatibility is expected but not verified.
3 months ago · Inactive