The Langtail Playground is a prompt engineer’s dream. It offers prompt templating, variables, tools, versioning, sharing, and more in a streamlined interface that helps you build and ship faster.

Here’s a quick overview of its interface:

Screenshot of the Langtail Playground

  1. Templates Panel: Here you can compose ‘System’, ‘Assistant’, and ‘User’ message templates that make up your prompt and are used when it is deployed. Insert variables using the {{myVariable}} syntax and format prompts with the templating syntax (see the template sketch after this list).
  2. Variables Panel: Here you can find and modify the variables defined within your templates to refine your prompt’s outputs.
  3. Tools Panel: Here you can create a custom tool specification that can enrich the AI model’s response. If the model decides to use the tool, the Assistant message will be a function call. You can mock the tool’s response, send it back to the model, and see the final output (see the tool-spec sketch after this list).
  4. Parameters Panel: Here you can choose an AI model and adjust its parameters to tailor the model’s behavior to your project.
  5. Send: Click to invoke the prompt and display the response.
  6. Messages Area: Here you can see the responses generated by the model after invoking the prompt.
  7. Save: Click to save your prompt and, if desired, create a named version.
  8. Share: Click to share your prompt with your team or publicly for the world to see.
  9. Deploy: Click to publish your prompt as an API to one or more environments. After deploying, open the “Deployments” tab to see the API endpoint and how to call it (see the request sketch after this list).
  10. Prompt versions: Here you can access and manage previous versions of your prompt and easily revert if necessary.
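
For illustration, the templates in the Templates Panel map onto an ordinary chat-message list. Here is a minimal sketch in TypeScript; the variable names `topic` and `question` are hypothetical placeholders, not ones Langtail provides:

```typescript
// Illustrative only: two message templates using the {{variable}} syntax.
// "topic" and "question" are hypothetical variable names whose values come
// from the Variables Panel (or from the request once the prompt is deployed).
const templates = [
  {
    role: "system",
    content: "You are a helpful assistant that answers questions about {{topic}}.",
  },
  { role: "user", content: "{{question}}" },
];
```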
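The Tools Panel takes a tool specification. Below is a minimal sketch written as a TypeScript object in the widely used OpenAI-style function-calling format; the tool name, description, and parameters are hypothetical, and the exact shape Langtail expects may differ, so check the Tools Panel itself:

```typescript
// Hypothetical tool spec: a weather lookup the model may choose to call.
// Field names follow the common OpenAI function-calling schema; verify the
// exact format against what the Tools Panel accepts.
const getWeatherTool = {
  type: "function",
  function: {
    name: "get_weather",
    description: "Get the current weather for a city.",
    parameters: {
      type: "object",
      properties: {
        city: { type: "string", description: "City name, e.g. Prague" },
      },
      required: ["city"],
    },
  },
};
```

If the model calls `get_weather`, the Assistant message contains the function call and its arguments; you can then mock a return value (for example `{ "temperature": 21 }`) and send it back to the model to see the final output.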
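Once deployed, the prompt is reachable over HTTP. The sketch below calls it with `fetch` in TypeScript; the URL, header value, and request body shape are placeholders only, since the real endpoint and call format are shown in the “Deployments” tab:

```typescript
// Placeholder endpoint and API key: copy the real values from the
// "Deployments" tab after deploying the prompt.
const response = await fetch("https://<your-langtail-endpoint>", {
  method: "POST",
  headers: {
    "Content-Type": "application/json",
    Authorization: "Bearer <YOUR_API_KEY>", // placeholder
  },
  body: JSON.stringify({
    // Hypothetical body shape: values for the {{variables}} used in the
    // templates; the actual request format is documented in "Deployments".
    variables: { topic: "astronomy", question: "Why is the sky dark at night?" },
  }),
});
const result = await response.json();
console.log(result);
```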