💬 This action works best in chat mode and is designed to process the most recent user message or full conversation.
🔍 What It Does
It sends your prompt to the LLM and expects a structured result in return, usually a comma-separated or JSON-style response that can be saved into a parameter.

🖼️ Action Interface

⚙️ Configuration Options
Custom Prompt (required)
Type: string
Provide a prompt instructing the assistant on what to extract from the conversation.

Example:
Extract the product name, issue type, and urgency level from the user’s message.

Or for simpler use:
Return a list of keywords related to the user’s question.
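Depending on which style of prompt you use, the LLM may reply with a JSON object or a plain comma-separated list. The sketch below shows one way such a reply could be turned into a usable value before saving it to a parameter; it is an illustration in Python, not part of the product itself, and the sample replies are invented for the example.

```python
import json

def parse_structured_reply(reply: str) -> dict:
    """Parse an LLM reply that may be JSON or a comma-separated list."""
    try:
        # JSON-style reply, e.g. {"product": "...", "issue": "...", "urgency": "..."}
        return json.loads(reply)
    except json.JSONDecodeError:
        # Fall back to treating the reply as a comma-separated keyword list
        return {"keywords": [item.strip() for item in reply.split(",")]}

# Hypothetical replies the action might receive, one per prompt style
json_reply = '{"product": "Widget Pro", "issue": "billing", "urgency": "high"}'
list_reply = "refund, billing, subscription"

print(parse_structured_reply(json_reply)["urgency"])   # high
print(parse_structured_reply(list_reply)["keywords"])  # ['refund', 'billing', 'subscription']
```

A fallback like this keeps the action usable even when the model ignores the requested format, which is why the tips below recommend stating the format explicitly in the prompt.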
🎯 Use Cases
- Extract customer issue tags for routing
- Identify urgency or sentiment from a request
- Convert a sentence into variables for logic or filtering
🧠 Tips
- Use specific instructions in your prompt like: “Return tags in a comma-separated list” or “List them as JSON fields”
- Combine with Set Current Flow, Function, or Return Value actions for dynamic control
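To make the second tip concrete, here is a minimal sketch of how extracted fields could drive a flow decision, in the spirit of combining this action with Set Current Flow. The flow names and field names are hypothetical, chosen for illustration only.

```python
def route_from_tags(extracted: dict) -> str:
    """Pick a flow name from extracted fields (flow names are hypothetical)."""
    if extracted.get("urgency") == "high":
        return "escalation_flow"
    if "billing" in extracted.get("keywords", []):
        return "billing_flow"
    return "general_support_flow"

print(route_from_tags({"urgency": "high"}))        # escalation_flow
print(route_from_tags({"keywords": ["billing"]}))  # billing_flow
print(route_from_tags({}))                         # general_support_flow
```

The same pattern applies to Function or Return Value actions: the extracted parameter becomes the input to whatever logic runs next.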