Quick Start

The 1-Line Integration

EdgeMask is an enterprise AI proxy that adds security and optimization to your LLM traffic without requiring you to rewrite your code. A single baseURL change is all it takes: EdgeMask acts as a drop-in replacement for your existing provider endpoint.

Drop-in Replacement

You can route the official OpenAI SDKs through EdgeMask to reach Anthropic, Gemini, or OpenAI models with zero compatibility issues. EdgeMask proxies all of your LLM requests, including SSE streaming responses.
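The two proxy endpoints used in the examples below suggest a per-provider URL pattern of the form `/v1/proxy/<provider>`. As a hypothetical sketch (the `gemini` path is an assumption inferred from that pattern and the list of supported providers, not confirmed by these docs):

```python
# Hypothetical helper: derive an EdgeMask proxy base URL per provider.
# Only the "openai" and "anthropic" paths appear in these docs; "gemini" is assumed.
EDGEMASK_HOST = "https://edgemask-backend.onrender.com"

def proxy_base_url(provider: str) -> str:
    """Return the assumed EdgeMask proxy endpoint for a provider."""
    supported = {"openai", "anthropic", "gemini"}  # "gemini" path is a guess
    if provider not in supported:
        raise ValueError(f"Unsupported provider: {provider}")
    return f"{EDGEMASK_HOST}/v1/proxy/{provider}"

print(proxy_base_url("anthropic"))
# https://edgemask-backend.onrender.com/v1/proxy/anthropic
```

You would then pass the returned URL as the `baseURL` / `base_url` of your OpenAI client, exactly as shown in the SDK examples below.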

Node.js (OpenAI SDK)

Just swap out the baseURL in your existing Node.js or TypeScript application:

app/api/chat.ts

```ts
import OpenAI from "openai";

// Point your existing OpenAI client at EdgeMask
const client = new OpenAI({
  apiKey: "edgemask_live_xYz123...", // Your EdgeMask API key
  baseURL: "https://edgemask-backend.onrender.com/v1/proxy/openai", // EdgeMask proxy endpoint
});

async function main() {
  const stream = await client.chat.completions.create({
    model: "gpt-4o",
    messages: [{ role: "user", content: "Customer John Doe's credit card is 4532 1111 2222 3333." }],
    stream: true,
  });

  for await (const chunk of stream) {
    process.stdout.write(chunk.choices[0]?.delta?.content || "");
  }
}

main();
```

Python (OpenAI SDK)

Point your Python OpenAI client at EdgeMask to talk to Anthropic models through the familiar OpenAI request format:

main.py

```python
from openai import OpenAI

client = OpenAI(
    api_key="edgemask_live_xYz123...",  # Your EdgeMask API key
    # Call Anthropic with full OpenAI SDK compatibility, no extra SDKs to install
    base_url="https://edgemask-backend.onrender.com/v1/proxy/anthropic",
)

response = client.chat.completions.create(
    model="claude-3-opus-20240229",
    messages=[
        {"role": "user", "content": "My AWS Secret key: AKIAIOSFODNN7EXAMPLE. Please save it."}
    ],
)

print(response.choices[0].message.content)
```