LangServe
LangServe makes it easy to deploy LangChain applications as RESTful APIs.
Overview
LangChain lets you build complex applications on top of the latest AI models using powerful, easy-to-reason-about abstractions. You can choose the best model for your project and then tune prompts and temperatures to get the best results from the LLM.
LangServe gives you the tools you need to deploy LangChain runnables and chains quickly and easily as RESTful APIs. It is built on FastAPI to deliver high-performance, production-ready services.
This demo application serves a simple page that lets you prompt a chatbot for information. The bot responds to queries with general information about the topic you provide. A sketch of what such a LangServe app might look like follows.
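The sketch below shows a minimal LangServe server of this shape: a prompt piped into a chat model and exposed as a REST API with FastAPI. The file name, prompt wording, and model choice are assumptions; only the /openai route (which backs the /openai/playground page) is taken from this demo.

```python
# server.py — a minimal sketch of a LangServe app; details are illustrative.
from fastapi import FastAPI
from langchain_core.prompts import ChatPromptTemplate
from langchain_openai import ChatOpenAI
from langserve import add_routes

# ChatOpenAI reads the OPENAI_API_KEY environment variable set at deploy time.
prompt = ChatPromptTemplate.from_template("Tell me about {topic}.")
chain = prompt | ChatOpenAI(model="gpt-3.5-turbo", temperature=0.7)

app = FastAPI(title="LangServe demo")

# Expose the chain as a REST API under /openai, including the interactive playground.
add_routes(app, chain, path="/openai")

if __name__ == "__main__":
    import uvicorn

    uvicorn.run(app, host="0.0.0.0", port=8000)
```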
Try it out
To deploy this application, you'll need to sign up for an OpenAI account and create an API key.
When you deploy the application, set the OPENAI_API_KEY environment variable to your API key value.
Once the LangServe application is deployed, you can view the prompt page by visiting the Koyeb App URL with /openai/playground
appended to the end:
https://<YOUR_APP_NAME>-<YOUR_KOYEB_ORG>.koyeb.app/openai/playground
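You can also call the deployed chain programmatically with LangServe's RemoteRunnable client. This is a minimal sketch: the URL placeholder matches the one above, and the "topic" input key assumes the prompt shown in the earlier example.

```python
# Call the deployed LangServe route from Python; input key is an assumption.
from langserve import RemoteRunnable

chain = RemoteRunnable("https://<YOUR_APP_NAME>-<YOUR_KOYEB_ORG>.koyeb.app/openai")
print(chain.invoke({"topic": "the Roman Empire"}))
```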