Why use the proxy server?
By simply changing the `baseURL`, you can instantly add a memory layer to your LLM calls without altering your existing code. The proxy automatically handles storing and retrieving context across conversations, making your LLM smarter and more context-aware. Instead of managing memory, state, or custom logic in your app, the proxy does the heavy lifting, so your model responses feel coherent, continuous, and truly conversational.
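As a minimal sketch of what that looks like, here is an Anthropic client pointed at a proxy endpoint via `baseURL`. The proxy URL shown is a placeholder, not the real Alchemyst endpoint — substitute the one from your dashboard or the platform docs.

```ts
import Anthropic from "@anthropic-ai/sdk";

// Only change: baseURL now points at the memory proxy (placeholder URL).
const client = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY,
  baseURL: "https://your-alchemyst-proxy.example.com/v1", // placeholder — use your actual proxy endpoint
});

// The rest of your code stays exactly as it was.
const message = await client.messages.create({
  model: "claude-3-5-sonnet-latest",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello!" }],
});
console.log(message.content);
```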
Get the AlchemystAI API key.
Sign up and get your API key here: Alchemyst platform. Set your API key via an environment variable or pass it directly when initializing the Anthropic client.
- Recommended env var: `ALCHEMYST_AI_API_KEY`
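A rough sketch of wiring the key in, assuming the proxy accepts it as a bearer token header — the env var name comes from above, but the exact header or parameter the proxy expects is an assumption, so check the Alchemyst platform docs for the authoritative option.

```ts
import Anthropic from "@anthropic-ai/sdk";

// Read the Alchemyst key from the recommended env var and fail fast if missing.
const alchemystKey = process.env.ALCHEMYST_AI_API_KEY;
if (!alchemystKey) {
  throw new Error("ALCHEMYST_AI_API_KEY is not set");
}

const client = new Anthropic({
  apiKey: process.env.ANTHROPIC_API_KEY,
  baseURL: "https://your-alchemyst-proxy.example.com/v1", // placeholder proxy endpoint
  defaultHeaders: { Authorization: `Bearer ${alchemystKey}` }, // assumed auth mechanism
});
```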