Add support to include bot message instructions in natural language using comments.
drazvan committed Jul 27, 2023
1 parent 0e941ac commit 755ee8a
Showing 5 changed files with 76 additions and 2 deletions.
3 changes: 2 additions & 1 deletion CHANGELOG.md
@@ -6,15 +6,16 @@ and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0

## Unreleased


### Added

- [Event-based API](./docs/user_guide/advanced/event-based-api.md) for guardrails.
- Support for message with type "event" in [`LLMRails.generate_async`](./docs/api/nemoguardrails.rails.llm.llmrails.md#method-llmrailsgenerate_async).
- Support for [bot message instructions](docs/user_guide/advanced/bot-message-instructions.md).

### Changed

- Changed the naming of the internal events to align to the upcoming UMIM spec (Unified Multimodal Interaction Management).
comments.)

### Fixed

2 changes: 2 additions & 0 deletions docs/README.md
@@ -44,6 +44,8 @@ The user guide covers the core details of the Guardrails toolkit and how to conf
The following guides explain various specific topics in more detail:

* [Extract User Provided Values](./user_guide/advanced/extract-user-provided-values.md): Learn how to extract user-provided values like a name, a date or a query.
* [Prompt Customization](./user_guide/advanced/prompt-customization.md): Learn how to customize the prompts for a new (or existing) type of LLM.
* [Bot Message Instructions](./user_guide/advanced/bot-message-instructions.md): Learn how to further tweak the bot messages with specific instructions at runtime.

## Evaluation Tools

44 changes: 44 additions & 0 deletions docs/user_guide/advanced/bot-message-instructions.md
@@ -0,0 +1,44 @@
# Bot Message Instructions

If you place a comment above a `bot something` statement, the comment will be included in the prompt, instructing the LLM further on how to generate the message.

For example:

```colang
define flow
  user express greeting
  # Respond in a very formal way and introduce yourself.
  bot express greeting
```

The above flow would generate a prompt (using the default prompt templates) that looks like this:

```
... (content removed for readability) ...
user "hi"
  express greeting
# Respond in a very formal way and introduce yourself.
bot express greeting
```

And in this case, the completion from the LLM will be:
```
"Hello there! I'm an AI assistant that helps answer mathematical questions. My core mathematical skills are powered by wolfram alpha. How can I help you today?"
```

Whereas if we change the flow to:

```colang
define flow
  user express greeting
  # Respond in a very informal way and also include a joke
  bot express greeting
```

Then the completion will be something like:

```
Hi there! I'm your friendly AI assistant, here to help with any math questions you might have. What can I do for you? Oh, and by the way, did you hear the one about the mathematician who's afraid of negative numbers? He'll stop at nothing to avoid them!
```

This is a very flexible mechanism for altering the generated messages.
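
As a quick orientation (not part of the documented example), here is a minimal sketch of how the mechanism above would typically be exercised from Python. The config path is a placeholder, and the exact wording of the reply will of course vary with the LLM:

```python
from nemoguardrails import LLMRails, RailsConfig

# Hypothetical config directory containing the flow above, i.e. a
# `bot express greeting` preceded by the instruction comment.
config = RailsConfig.from_path("path/to/config")
rails = LLMRails(config)

# The comment is injected into the prompt for the message being generated,
# so the greeting should come back in the requested style.
response = rails.generate(messages=[{"role": "user", "content": "hi"}])
print(response["content"])
```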
12 changes: 12 additions & 0 deletions nemoguardrails/actions/llm/utils.py
@@ -75,6 +75,14 @@ def get_colang_history(
    if not events:
        return history

    # We compute the index of the last bot message. We need it so that we include
    # the bot message instruction only for the last one.
    last_bot_intent_idx = len(events) - 1
    while last_bot_intent_idx >= 0:
        if events[last_bot_intent_idx]["type"] == "BotIntent":
            break
        last_bot_intent_idx -= 1

    for idx, event in enumerate(events):
        if event["type"] == "UtteranceUserActionFinished" and include_texts:
            history += f'user "{event["final_transcript"]}"\n'
@@ -84,6 +92,10 @@
            else:
                history += f'user {event["intent"]}\n'
        elif event["type"] == "BotIntent":
            # If we have instructions, we add them before the bot message.
            # But we only do that for the last bot message.
            if "instructions" in event and idx == last_bot_intent_idx:
                history += f"# {event['instructions']}\n"
            history += f'bot {event["intent"]}\n'
        elif event["type"] == "StartUtteranceBotAction" and include_texts:
            history += f'  "{event["script"]}"\n'
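
To make the effect of this change concrete, here is a small illustrative snippet (not part of the commit) that builds an event list by hand, using only the event types and fields visible in the diff, and prints the resulting Colang history. It assumes the events list can be passed as the first argument and that `include_texts` defaults to `True`:

```python
from nemoguardrails.actions.llm.utils import get_colang_history

# Hand-built events using the shapes referenced in the diff above.
events = [
    {"type": "UtteranceUserActionFinished", "final_transcript": "hi"},
    {
        "type": "BotIntent",
        "intent": "express greeting",
        "instructions": "Respond in a very formal way and introduce yourself.",
    },
]

# The instruction comment is emitted only for the last BotIntent event:
#   user "hi"
#   # Respond in a very formal way and introduce yourself.
#   bot express greeting
print(get_colang_history(events))
```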
17 changes: 16 additions & 1 deletion nemoguardrails/flows/flows.py
@@ -105,6 +105,9 @@ class State:
    next_step_by_flow_uid: Optional[str] = None
    next_step_priority: float = 0.0

    # The comment is extracted from the source code.
    next_step_comment: Optional[str] = None

    # The updates to the context that should be applied before the next step
    context_updates: dict = field(default_factory=dict)

@@ -199,6 +202,13 @@ def _record_next_step(
        new_state.next_step_by_flow_uid = flow_state.uid
        new_state.next_step_priority = flow_config.priority * priority_modifier

        # Extract the comment, if any.
        new_state.next_step_comment = (
            flow_config.elements[flow_state.head]
            .get("_source_mapping", {})
            .get("comment")
        )


def _call_subflow(new_state: State, flow_state: FlowState) -> Optional[FlowState]:
    """Helper to call a subflow.
@@ -518,7 +528,12 @@ def compute_next_steps(

    # If we have a next step, we make sure to convert it to proper event structure.
    if state.next_step:
        next_steps.append(_step_to_event(state.next_step))
        next_step_event = _step_to_event(state.next_step)
        if next_step_event["type"] == "BotIntent" and state.next_step_comment:
            # For bot intents, we use the comment as instructions
            next_step_event["instructions"] = state.next_step_comment

        next_steps.append(next_step_event)

    # Finally, we check if there was an explicit "stop" request
    if actual_history:
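
For readers tracing the data flow, here is a rough sketch (assumed, not taken verbatim from the code) of how the comment travels: the Colang parser records it under the element's `_source_mapping`, `_record_next_step` copies it into `state.next_step_comment`, and `compute_next_steps` attaches it to the outgoing `BotIntent` event, which `get_colang_history` later renders as a `#` line in the prompt:

```python
# Assumed shape of a parsed flow element once the preceding comment has been
# captured by the parser (keys other than "_source_mapping" and "comment"
# are illustrative).
element = {
    "_source_mapping": {
        "comment": "Respond in a very formal way and introduce yourself.",
    },
}

# What compute_next_steps() would then emit for the annotated bot message.
next_step_event = {
    "type": "BotIntent",
    "intent": "express greeting",
    "instructions": element["_source_mapping"]["comment"],
}
```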
