Adobe has been on a steady course of adding Firefly-powered generative AI features to its tools. The company's most recent software to benefit from direct GenAI integration is Substance 3D, specifically the Sampler and Stager apps. Beta versions of the new Firefly features are available to Substance 3D subscribers.
Adobe’s Firefly has taken flight yet again. Ever since Firefly began generating buzz among creatives, the industry has been bitten by the GenAI bug. Firefly-powered tools and capabilities have transformed Illustrator, Express, and Photoshop, and quickly spread to the video realm within Premiere Pro and After Effects. Adobe even hatched a stand-alone web application, announced three new Firefly models last fall, and released GenStudio for the enterprise. In fact, by the time Adobe Max 2023 rolled around, the company had released more than 100 new AI features and updates. And those features keep coming: this week, Firefly generative AI features arrived in beta directly within the Substance 3D ecosystem.
The infusion of Firefly into Substance 3D’s Sampler and Stager tools is designed to boost creative workflows, speeding up the iterative processes of 3D texture generation and background creation.
“Generative tools are at their best when they blend seamlessly into established creative workflows. That’s the philosophy driving Substance 3D’s approach to generative capabilities in Substance apps,” says Francois Cotton, senior director of 3D product marketing at Adobe.
With Substance 3D Sampler’s Firefly-powered Text to Texture feature, users can craft photorealistic or stylized textures for 3D object surfaces without spending time hunting for input images. Artists start by describing the base material they need, then choose from the best results and add them to a layer. They can refine those results or generate variations by altering their prompt, all without leaving Sampler. The feature produces square, tileable images with proper perspective. According to Adobe, the generated image is then analyzed by Sampler’s Image to Material function, which builds a material from the base input texture, producing normal, height, and roughness maps that yield a parametric material. The material can be applied to an asset in Painter or another 3D application and then used in gaming, VFX, or product design.
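For readers wondering what those maps represent, a normal map encodes per-pixel surface orientation derived from slope information in the texture. As a rough conceptual sketch only, not Adobe’s implementation and with placeholder file names, a tangent-space normal map can be derived from a height map in a few lines of Python with NumPy and Pillow:

    import numpy as np
    from PIL import Image

    def height_to_normal(height_path, strength=2.0):
        # Load the height map as a normalized grayscale array.
        h = np.asarray(Image.open(height_path).convert("L"), dtype=np.float32) / 255.0

        # Finite-difference gradients approximate the surface slope at each pixel.
        dy, dx = np.gradient(h)

        # Steeper slopes tilt the normal away from straight-up (+Z).
        nx, ny, nz = -dx * strength, -dy * strength, np.ones_like(h)
        length = np.sqrt(nx * nx + ny * ny + nz * nz)
        normal = np.stack([nx, ny, nz], axis=-1) / length[..., None]

        # Remap from [-1, 1] to the conventional [0, 255] RGB encoding.
        rgb = ((normal * 0.5 + 0.5) * 255.0).astype(np.uint8)
        return Image.fromarray(rgb, mode="RGB")

    height_to_normal("generated_base_texture_height.png").save("normal_map.png")

Sampler’s Image to Material performs this kind of analysis automatically, alongside the height and roughness estimation described above, which is what makes the resulting material parametric and editable.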
The Substance 3D Stager tool also uses text prompts, in this case to generate detailed backgrounds for product visualizations, enabling users to stage a scene in seconds. With Stager’s new Generative Background feature, a simple text description produces a rich setting whose perspective and lighting are automatically aligned with the scene’s composition by the ML-driven Match Image function.
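Conceptually, the end result is a product render composited over the generated backdrop, with Stager handling the perspective and lighting match itself. A minimal, purely illustrative sketch of that final composite, using placeholder file names and making no attempt to replicate Match Image, might look like this in Python with Pillow:

    from PIL import Image

    # Illustrative only: layer a rendered product (with alpha) over a
    # generated backdrop. Stager aligns perspective and lighting for you.
    background = Image.open("generated_background.png").convert("RGBA")
    product = Image.open("product_render.png").convert("RGBA")

    # Match sizes, then alpha-composite the render over the backdrop.
    product = product.resize(background.size)
    staged = Image.alpha_composite(background, product)
    staged.convert("RGB").save("staged_scene.jpg")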
Adobe applies its Content Credentials to the resulting imagery, as it does to other Firefly-created or -edited content, denoting that generative AI was used in the creative process.
The GenAI Stager and Sampler beta apps are available now for Substance 3D subscribers.