
2 posts tagged with "tera"


Will Rudenmalm · 2 min read

We're excited to announce the release of llm-chain v0.8.0, a significant update to our Rust library for building applications with large language models. This release introduces a host of improvements and new features, including a completely revamped Prompt system and more streamlined handling of Parameters. Let's dive into the details!

Revamped Prompt System

Our new Prompt system has been redesigned from the ground up to provide greater flexibility and efficiency in working with language models. In llm-chain v0.8.0, we've introduced new structs and enums to better represent chat messages and their roles, such as ChatMessage, ChatMessageCollection, and ChatRole. We've also introduced the Data enum, which represents either a collection of chat messages or a single piece of text, making it easier to work with different types of data.
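To make the shape of the new prompt types concrete, here is a minimal sketch of the data model described above; the field and variant names beyond those mentioned in this post are illustrative rather than llm-chain's exact API.

```rust
// Illustrative sketch of the v0.8.0 prompt data model; not the exact API.
#[derive(Debug)]
enum ChatRole {
    System,
    User,
    Assistant,
}

#[derive(Debug)]
struct ChatMessage {
    role: ChatRole,
    content: String,
}

#[derive(Debug)]
struct ChatMessageCollection {
    messages: Vec<ChatMessage>,
}

/// Either a structured chat conversation or a single completion-style text.
#[derive(Debug)]
enum Data {
    Chat(ChatMessageCollection),
    Text(String),
}

fn main() {
    let chat = Data::Chat(ChatMessageCollection {
        messages: vec![
            ChatMessage {
                role: ChatRole::System,
                content: "You are a helpful assistant.".to_string(),
            },
            ChatMessage {
                role: ChatRole::User,
                content: "Summarise the v0.8.0 release notes.".to_string(),
            },
        ],
    });
    let completion = Data::Text("Write a haiku about Rust.".to_string());
    println!("{chat:?}\n{completion:?}");
}
```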

Furthermore, we've created a more powerful PromptTemplate system that allows you to format prompts with a set of parameters. This enables you to dynamically generate prompts for your language models without the need for cumbersome string manipulation.
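As a rough illustration of the idea (a hypothetical helper, not llm-chain's actual PromptTemplate API), formatting a template amounts to substituting named parameters into placeholders:

```rust
use std::collections::HashMap;

// Hypothetical placeholder-substitution helper; it only demonstrates the
// formatting idea, not llm-chain's real PromptTemplate behaviour.
fn format_prompt(template: &str, parameters: &HashMap<&str, &str>) -> String {
    parameters.iter().fold(template.to_string(), |acc, (key, value)| {
        acc.replace(&format!("{{{{{key}}}}}"), value)
    })
}

fn main() {
    let parameters = HashMap::from([("language", "Rust"), ("topic", "error handling")]);
    let prompt = format_prompt("Explain {{topic}} in {{language}}.", &parameters);
    assert_eq!(prompt, "Explain error handling in Rust.");
    println!("{prompt}");
}
```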

Executors No Longer Handle Parameters

With the release of llm-chain v0.8.0, we've shifted the responsibility of handling Parameters from the executors to the main llm-chain crate. This change simplifies the process of working with executors, allowing developers to focus more on the core functionality of their language models.
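Conceptually, the split looks like this: parameters are resolved into a finished prompt inside the core crate, and the executor only ever sees the result. The trait below is illustrative, not the crate's actual definition.

```rust
// Illustrative only: an executor receives a fully formatted prompt and
// never deals with Parameters itself.
trait Executor {
    fn execute(&self, prompt: &str) -> String;
}

struct EchoExecutor;

impl Executor for EchoExecutor {
    fn execute(&self, prompt: &str) -> String {
        format!("(model output for: {prompt})")
    }
}

fn main() {
    // Parameter substitution happens up front, in the core crate...
    let prompt = "Explain error handling in Rust.";
    // ...so the executor's job is reduced to running the finished prompt.
    println!("{}", EchoExecutor.execute(prompt));
}
```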

What's Next?

This release marks a significant step forward in the evolution of llm-chain. However, we're not stopping here! We'll continue to refine and expand the library's capabilities, making it even more powerful and user-friendly.

We encourage you to check out llm-chain v0.8.0 and experience the benefits of the improved Prompt system and streamlined handling of Parameters. As always, we appreciate your feedback and contributions to help make llm-chain the best language model library out there.

Upgrade to llm-chain v0.8.0 today and take your language models to the next level!

Will Rudenmalm · 2 min read

We are thrilled to announce the release of llm-chain v0.6.0, which introduces significant enhancements to our library. This update focuses on making llm-chain more robust and versatile, allowing developers to build even more advanced applications with ease.

Major updates

1. The switch to the tera template language

One of the most significant changes in this release is the introduction of the tera template language. This powerful and flexible templating system enables developers to create dynamic and complex templates for their projects. The tera language allows for more advanced control structures and filters, making it a substantial upgrade from the previous templating system.
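For example, a tera template can use control structures and filters directly in the prompt text. The snippet below renders one such template with the tera crate's one-off API; the template itself is just an illustration.

```rust
use tera::{Context, Tera};

fn main() -> tera::Result<()> {
    let mut context = Context::new();
    context.insert("topic", "the Rust borrow checker");
    context.insert("points", &["ownership", "borrowing", "lifetimes"]);

    // A control structure ({% for %}) and a filter (| upper) in one template.
    let template = "Explain {{ topic | upper }}. Cover:\n\
                    {% for point in points %}- {{ point }}\n{% endfor %}";
    let prompt = Tera::one_off(template, &context, false)?;
    println!("{prompt}");
    Ok(())
}
```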

2. Improved prompt system

Another notable update is the revamped prompt system. With llm-chain v0.6.0, the prompt system now supports both chat- and completion-style models. This means developers no longer need to worry about whether they are targeting a chat or a completion model when crafting prompts. The unified approach simplifies the development process and makes it easier to work with various types of language models.
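In practice this means the same prompt value can be handed to either kind of model, with a chat-style prompt flattened into plain text when a completion model is the target. A rough sketch of the idea, with illustrative types rather than llm-chain's exact API:

```rust
// Illustrative sketch: one prompt type, usable with chat or completion models.
enum Prompt {
    Text(String),
    Chat(Vec<(String, String)>), // (role, content) pairs
}

impl Prompt {
    /// Flatten into a single string for completion-style models; chat models
    /// would consume the structured messages directly.
    fn to_completion_text(&self) -> String {
        match self {
            Prompt::Text(text) => text.clone(),
            Prompt::Chat(messages) => messages
                .iter()
                .map(|(role, content)| format!("{role}: {content}"))
                .collect::<Vec<_>>()
                .join("\n"),
        }
    }
}

fn main() {
    let prompt = Prompt::Chat(vec![
        ("system".to_string(), "You are a helpful assistant.".to_string()),
        ("user".to_string(), "What changed in v0.6.0?".to_string()),
    ]);
    println!("{}", prompt.to_completion_text());
}
```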

3. Updated LLaMA.cpp

The latest version of LLaMA.cpp has been integrated into this release, ensuring better performance and stability for your projects.

Other improvements

1. Safer error handling

In addition to the major updates, llm-chain v0.6.0 also improves error handling. Templates now return Result rather than panicking on errors, making it easier to handle any issues that arise during development. Executors likewise return Result instead of panicking, providing a more consistent and safer API.
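Since templates are now tera templates, rendering failures surface as ordinary Result values. A small sketch of what handling them can look like, using the tera crate directly for illustration:

```rust
use tera::{Context, Tera};

fn main() {
    // "name" is deliberately missing from the context, so rendering returns
    // an Err value instead of panicking.
    let context = Context::new();
    match Tera::one_off("Hello {{ name }}!", &context, false) {
        Ok(prompt) => println!("{prompt}"),
        Err(err) => eprintln!("template error: {err}"),
    }
}
```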

Time to move on from the old templating system

With the introduction of the tera template language, we strongly recommend moving away from the old templating system. This update provides a solid foundation for building even more advanced applications using the llm-chain library.

We hope you're as excited about these enhancements as we are! As always, we appreciate your feedback and support. If you have any questions or need help, please don't hesitate to reach out on Discord!

Happy coding! 🚀