
Commit

Added more detailed metrics to logger that help characterize an LLM's performance.
shailensobhee committed May 3, 2024
1 parent f4eff75 commit 4101843
Showing 1 changed file with 1 addition and 1 deletion.
phi/llm/ollama/chat.py (1 addition, 1 deletion):

@@ -270,7 +270,7 @@ def response_stream(self, messages: List[Message]) -> Iterator[str]:
         response_timer.stop()
         logger.debug(f"Number of tokens generated: {completion_tokens}")
         logger.debug(f"Time per output token: {response_timer.elapsed/completion_tokens:.4f}s")
-        logger.debug(f"Throughtput: {completion_tokens/response_timer.elapsed:.4f}s")
+        logger.debug(f"Throughtput: {completion_tokens/response_timer.elapsed:.4f} tokens/s")
        logger.debug(f"Time to generate response: {response_timer.elapsed:.4f}s")

        # -*- Create assistant message
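The changed line reports throughput as tokens generated divided by elapsed wall-clock time, alongside its inverse (time per output token). A minimal self-contained sketch of the same metrics outside phi's codebase (the `log_stream_metrics` helper is hypothetical; phi uses its own `response_timer` object, approximated here with `time.perf_counter`):

```python
import logging
import time
from typing import Iterator, Tuple

logging.basicConfig(level=logging.DEBUG)
logger = logging.getLogger(__name__)

def log_stream_metrics(token_iter: Iterator[str]) -> Tuple[int, float]:
    """Consume a token stream and log the same metrics the commit adds."""
    start = time.perf_counter()
    completion_tokens = 0
    for _ in token_iter:
        completion_tokens += 1
    elapsed = time.perf_counter() - start
    logger.debug(f"Number of tokens generated: {completion_tokens}")
    if completion_tokens:  # guard against division by zero on empty streams
        logger.debug(f"Time per output token: {elapsed/completion_tokens:.4f}s")
    if elapsed:
        logger.debug(f"Throughput: {completion_tokens/elapsed:.4f} tokens/s")
    logger.debug(f"Time to generate response: {elapsed:.4f}s")
    return completion_tokens, elapsed
```

Attaching the `tokens/s` unit to the throughput line, as this commit does, disambiguates it from the neighboring duration metrics, which are all reported in seconds.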
