LMQL 0.7.1
This is a bug-fix release, addressing minor issues in 0.7.
- Fix an issue with the distribution clause and inference tracing
- Make the 'random' model independent of chunk_size
- Optimize automatic chunk_size selection to reduce the number of LLM calls (max_tokens hinting)
- Add support for direct generation from a list of OpenAI Chat format dictionaries