This app lets you talk to your ETABS models using AI. It includes an AI agent with several tools to:
- Query model results
- Post-process results
- Design components based on results
You can ask the AI agent many questions like:
- "List all load combinations"
- "What is the mass of the model?"
- "What is the lowest modal period?"
You can even ask what the agent can do! It can also generate visuals, as shown in the sample chat below:
You can not only ask for results but also process them. For example, you can make a heat map of the reaction loads, plot deformed shapes, or plot internal loads:
All model results are available to support design workflows. For example, the AI agent can help design pad foundations: it will ask for the allowable soil bearing pressure and the load case, then calculate the required footing size and suggest foundation dimensions:
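To make the sizing step concrete, here is a hedged sketch of the underlying bearing check (the function name and units are assumptions, footing self-weight and eccentricity are ignored, and this is not the agent's actual implementation):

```python
import math

def required_footing_side(axial_load_kn: float, allowable_pressure_kpa: float) -> float:
    """Side length (m) of a square pad footing whose bearing area keeps
    soil pressure at or below the allowable value."""
    area_m2 = axial_load_kn / allowable_pressure_kpa  # kN / kPa = m^2
    return math.sqrt(area_m2)

# Example: 500 kN column load on soil with 200 kPa allowable bearing pressure
side = required_footing_side(500.0, 200.0)
print(f"{side:.2f} m")  # -> 1.58 m square footing
```

The agent performs this kind of calculation with the loads it retrieves from the model, then rounds the result up to a practical dimension.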
The app needs access to the OpenAI API to function. It uses structured outputs, which let you retrieve data in predictable, easily manageable formats. To simplify working with structured outputs, the app leverages the Instructor framework, which makes it straightforward to define how you want your model responses structured. You can learn how to use Instructor in just a few minutes here. Additionally, Instructor makes it easy to switch between different LLM providers, like Anthropic, without significant code changes.
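As a minimal sketch of how Instructor-style structured outputs work (the schema below is illustrative, not the app's actual one, and assumes `pydantic` is installed; the client call is shown in comments because it requires an API key):

```python
from pydantic import BaseModel

# Illustrative response schema -- not the app's actual schema.
class ModalResult(BaseModel):
    mode: int
    period_s: float

# With Instructor, you pass the schema as response_model and receive a
# validated ModalResult instance instead of raw text:
#
#   import instructor
#   from openai import OpenAI
#
#   client = instructor.from_openai(OpenAI())
#   result = client.chat.completions.create(
#       model="gpt-4o",
#       response_model=ModalResult,
#       messages=[{"role": "user", "content": "What is the lowest modal period?"}],
#   )

print(ModalResult(mode=1, period_s=1.42))
```

Because the response arrives as a typed object, downstream code can branch on its fields instead of parsing free-form text.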
Here's a walkthrough of how the chat integration and visualization workflow come together:

- First, you set up `vkt.Chat` in your `vkt.Parametrization`, which takes care of managing user conversations. When a user submits a message, your controller gets triggered.
- Inside the controller method linked to `vkt.Chat`, fetch the entire conversation history using `params.chat.get_messages()` and pass these messages to your LLM.
- The conversation is sent to the LLM response endpoint. You can send it directly to the LLM provider (see the OpenAI Quickstart), or use a framework as a middleman, as this app does with Instructor.
- The LLM can respond in two ways:
  - Plain text: directly update the conversation by calling `vkt.ChatResult(params.chat, "Your simple reply here.")`.
  - Structured data (like a tool call): continue with the next steps.
- If the LLM returns structured data or a tool call, proceed to:
  - Map the data to a corresponding function in `app/tools`.
  - Execute the function to produce a Plotly figure.
  - Serialize the figure to JSON and save it using VIKTOR Storage:

    ```python
    vkt.Storage().set(
        "view",
        data=vkt.File.from_data(figure.to_json().encode()),
        scope="entity",
    )
    ```

- Finally, your view (defined with `@vkt.PlotlyView`) retrieves the stored JSON, recreates the Plotly figure, and displays it:

  ```python
  raw = vkt.Storage().get("view", scope="entity").getvalue()
  fig = go.Figure(json.loads(raw))
  return vkt.PlotlyResult(fig.to_json())
  ```

- The app manages its storage, deleting or updating stored data when inputs change, so the views remain current.
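The tool-call routing step of this workflow can be sketched as a plain name-to-function map (the tool names and return values below are invented for illustration; the real handlers live in `app/tools` and return Plotly figures):

```python
# Hypothetical registry mapping tool names (as returned by the LLM)
# to handler functions.
def list_load_combinations(args: dict) -> list[str]:
    return ["1.2D + 1.6L", "1.0D + 1.0E"]  # placeholder data

TOOLS = {
    "list_load_combinations": list_load_combinations,
}

def dispatch(tool_name: str, args: dict):
    """Look up the handler for a structured tool call and execute it."""
    handler = TOOLS.get(tool_name)
    if handler is None:
        raise ValueError(f"unknown tool: {tool_name}")
    return handler(args)

print(dispatch("list_load_combinations", {}))  # -> ['1.2D + 1.6L', '1.0D + 1.0E']
```

Keeping the registry as a dict means adding a new capability is just a matter of writing a handler and registering its name alongside the schema the LLM is told about.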
For local development, securely store your OpenAI API key in a `.env` file in the project's root directory (see `.env.example`). The key is loaded with the `python-dotenv` module. Always keep this key secret and never expose or commit it publicly!
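For reference, a typical `.env` entry looks like this (the variable name follows OpenAI's convention and the value is a placeholder, not a real key):

```
OPENAI_API_KEY=sk-your-key-here
```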
For published apps, manage your API keys using VIKTOR’s environment variables. Maintainers and administrators can set environment variables via the 'Apps' menu in your VIKTOR environment. Environment variables are encrypted with AES encryption. Here's a quick overview:
- Go to the Apps page, find your app, and click Details.
- Select the Variables tab to add, update, or delete environment variables.
- Use the Add Variable button to create a new variable. Variables marked as secret cannot be read again, only deleted. Regular variables can be viewed and edited later.
For detailed instructions, please visit the official VIKTOR environment variables documentation.
To help you quickly get up to speed with the app and dive deeper into specific components, here are several useful resources and tutorials:
- Blog: How Engineers Can Use AI Agents and MCP Servers to Work Smarter: Basic concepts about agent tools in engineering.
- Blog: AI-powered ETABS Model Post-Processing Using Python and VIKTOR: A blog about this repo!
- Instructor Framework – Integrations Documentation: Explore supported integrations for different AI providers using the Instructor framework.
- OpenAI Cookbook: Follow step-by-step guides to work with OpenAI models.
- VIKTOR Chat Component (`vkt.Chat`): Understand how to implement interactive LLM-based chat interfaces within your VIKTOR apps.
- VIKTOR Storage Component (`vkt.Storage`): Learn about efficiently storing and retrieving persistent data within your apps.
- VIKTOR Plotly View Component (`vkt.PlotlyView`): Find out how to render interactive visualizations and charts.