A Gradio-based UI for accessing the Adobe Substance 3D API, focusing on the "Generate 3D Object Composite" endpoint.
> [!IMPORTANT]
> This section assumes that Git, Python, and Anaconda are properly installed on your machine.
Follow these steps to install SubstanceAI GUI:
```shell
git clone https://github.com/brayevalerien/SubstanceAI-GUI
cd SubstanceAI-GUI
conda create -n substanceai-gui -y python=3.12 && conda activate substanceai-gui
pip install -r requirements.txt
```
Once you've installed the project, run `python webui.py` to start the UI. Follow these steps to prepare and send your request to the Substance API:
- Add your Bearer token if you aren't using Client credentials. If you are using Client credentials (`client_id` and `client_secret`), set the `CLIENT_ID` and `CLIENT_SECRET` environment variables or create a `.env` file, and restart the application.
- Load a 3D scene. IMPORTANT: your scene must have been exported as either a `.usdz` (best) or a `.glb` file.
- Write a prompt.
- Write the exact names of the hero object (your product) and of the camera you want to use for rendering. Use the names as they appear in the scene; otherwise the API will not be able to find your objects.
- Choose an image generation model to render the background and the lighting. Note that Image 4 Ultra gives higher-quality results but is still experimental, so it can currently only generate one image at a time.
- Choose the image count and the seed.
- Set your resolution (note that the resolution set in your scene will be ignored).
- Optionally, choose a content class (image style, can be "photo" or "art") and/or a style image (adjust the style image strength using the slider).
- Send the request by hitting the "Generate" button.
Generation takes anywhere from a few seconds to a couple of minutes, and the generated image will appear in the main image frame in the middle of the UI. Note that uploading a large file for the first time makes the process longer; if you reuse the same assets, the files do not have to be re-uploaded and generation is faster.
For debugging purposes, you can inspect what was sent to the API and what the response was by opening the "Data exchange (dev view)" section.
