The LLM Hints in Widget action allows your assistant to generate intelligent hint suggestions using a custom LLM prompt. These hints are then displayed as clickable options inside the chat widget.
✨ Powered by your own language model prompt. Ideal for dynamic and context-aware hint generation.

πŸ” What It Does

Instead of defining static hint options, this action asks the assistant to generate relevant choices using a custom prompt. The output can vary based on the conversation, user input, or session parameters.
🔎 Perfect for smart search suggestions, personalized onboarding, or adaptive FAQ prompts.
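
Conceptually, the flow looks like the sketch below: the custom prompt is filled in with conversation or session parameters, sent to the model, and the returned list is rendered as clickable hints. This is a minimal illustration only, assuming a generic completion callable named llm_complete; it is not the action's actual implementation or API.

```python
import json

def generate_hints(llm_complete, prompt_template: str, params: dict) -> list[str]:
    """Illustrative sketch: turn a hint-generation prompt into clickable hint options."""
    # Substitute @parameter placeholders with values from the conversation/session.
    prompt = prompt_template
    for name, value in params.items():
        prompt = prompt.replace(f"@{name}", str(value))

    # Ask the model for a machine-readable list of short hint strings.
    raw = llm_complete(prompt + "\nReturn a JSON array of short hint strings.")

    try:
        hints = json.loads(raw)
    except json.JSONDecodeError:
        # Fall back to one hint per non-empty line if the model ignores the JSON request.
        hints = [line.strip("-• ").strip() for line in raw.splitlines() if line.strip()]

    return hints[:5]  # cap the list so the widget stays uncluttered
```

In the action itself you typically only supply the prompt; the widget handles the generation and displays the results as clickable options.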

πŸ–ΌοΈ Action Interface

[Screenshot: LLM Generate Hints UI]

βš™οΈ Configuration Options


🧵 Tips

  • Pull conversational context into the prompt with @parameters.
  • Keep the prompt direct and focused on generating options (e.g., questions, intents, next actions); an example prompt follows this list.
  • Useful when different users might need different help pathways.
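
For example, a prompt along these lines keeps the model focused on producing options rather than free-form prose (the parameter names @last_user_message and @user_plan are illustrative, not predefined by the platform):

```text
You are generating quick-reply hints for a support chat widget.
The user's last message was: "@last_user_message"
Their current plan is: "@user_plan"
Suggest 3 short follow-up questions this user is likely to ask next.
Return only the questions, one per line, each under 60 characters.
```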