Talk | MLCon 2024 | 28.11.2024

Talk to your systems: Integrating Gen AI into your architectures with structured LLM output

Talking to your data (aka RAG) is the 'Hello World' use case for LLMs. But there is much more to explore. Based on their understanding of human language, LLMs can drive innovative user interactions for applications and systems. In this session, Christian demonstrates how to use structured data output with data schemas and function calling to interconnect your APIs with the power of LLMs. Discover how to unlock the potential of your solutions by harnessing the transformative nature of Generative AI. Join this session and let's talk to your systems!
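The core idea the abstract describes can be sketched briefly: you declare a data schema for a "function" the LLM may call, and validate the JSON arguments the model emits before handing them to your API. This is a minimal, hedged sketch using only the Python standard library; the function name `create_ticket`, its fields, and the sample LLM output are illustrative assumptions, not part of the talk.

```python
# Sketch: schema-guided function calling (illustrative names and fields).
import json
from dataclasses import dataclass


@dataclass
class CreateTicket:
    """Typed arguments for the hypothetical 'create_ticket' function."""
    title: str
    priority: str


# JSON Schema you would hand to an LLM's function-calling / structured-output
# mode so it emits arguments in exactly this shape (illustrative).
SCHEMA = {
    "name": "create_ticket",
    "parameters": {
        "type": "object",
        "properties": {
            "title": {"type": "string"},
            "priority": {"type": "string", "enum": ["low", "medium", "high"]},
        },
        "required": ["title", "priority"],
    },
}


def parse_call(raw: str) -> CreateTicket:
    """Validate the model's raw JSON arguments against the required fields."""
    data = json.loads(raw)
    missing = [k for k in SCHEMA["parameters"]["required"] if k not in data]
    if missing:
        raise ValueError(f"LLM output missing fields: {missing}")
    return CreateTicket(**data)


# Example: arguments as a function-calling LLM might emit them (assumed).
ticket = parse_call('{"title": "Printer offline", "priority": "high"}')
```

In practice, libraries such as Instructor (mentioned in the tags below) wrap this pattern with Pydantic models and automatic retries instead of hand-rolled validation.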

Christian Weyer
Christian Weyer is co-founder and CTO of Thinktecture. He’s been creating software for more than two decades.

Event

MLCon 2024
25.11.24 - 29.11.24 @ Berlin (DE)
MLC24_BER

Slidedeck

