# Cloud Azure | Skills Pool

Azure cloud deployment: App Service, Functions, Container Instances, Azure OpenAI.

By frvnkfrmchicago · 0 stars · Updated 2026-03-09

Deploy and scale on Microsoft Azure.

See also: agents/cloud-aws/SKILL.md, agents/cloud-google/SKILL.md
## Context Questions

Before deploying to Azure:

- **Why Azure?** — Enterprise/Microsoft shop, Azure OpenAI, existing Azure credits
- **What's the deployment model?** — App Service (PaaS), Functions (serverless), containers
- **What AI needs?** — Azure OpenAI for GPT-4, embeddings, enterprise compliance
- **What's the DevOps setup?** — Azure DevOps, GitHub Actions, or manual
- **Infrastructure as code?** — Bicep, Terraform, or portal
## Dimensions
| Dimension | Spectrum |
|---|---|
| Compute | App Service ←→ Functions ←→ ACI ←→ AKS |
| AI | None ←→ Azure OpenAI ←→ Cognitive Services |
| IaC | Portal (manual) ←→ Bicep ←→ Terraform |
| CI/CD | GitHub Actions ←→ Azure DevOps |
| Secrets | Environment vars ←→ Key Vault |
## Derivation Logic

| If Context Is... | Then Consider... |
|---|---|
| Enterprise/Microsoft shop | Azure (natural fit) |
| Need Azure OpenAI | Required for GPT-4 with enterprise compliance |
| .NET backend | App Service (best support) |
| Serverless | Azure Functions |
| Container workloads | ACI (simple) or AKS (complex) |
| Already using GitHub | GitHub Actions over Azure DevOps |
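The derivation rules above are mechanical enough to sketch in code. A minimal TypeScript sketch (the `recommendCompute` helper and its input shape are illustrative, not part of any Azure SDK):

```typescript
// Illustrative encoding of the compute-selection rules above; names are hypothetical.
interface DeployContext {
  dotnetBackend?: boolean;
  serverless?: boolean;
  containers?: "simple" | "complex";
}

function recommendCompute(ctx: DeployContext): string {
  if (ctx.containers === "complex") return "AKS";
  if (ctx.containers === "simple") return "ACI";
  if (ctx.serverless) return "Azure Functions";
  if (ctx.dotnetBackend) return "App Service";
  return "App Service"; // default PaaS choice
}

console.log(recommendCompute({ serverless: true })); // → "Azure Functions"
```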
## TL;DR

| Service | Use Case |
|---|---|
| App Service | Web apps, APIs |
| Functions | Serverless, event-driven |
| Container Instances | Quick containers |
| Azure OpenAI | GPT-4, embeddings |
| Key Vault | Secrets |
| DevOps | CI/CD pipelines |
## Part 1: Azure App Service
### Deploy Next.js App

```bash
# Install Azure CLI
brew install azure-cli

# Login
az login

# Create resource group
az group create --name myapp-rg --location eastus

# Create App Service plan
az appservice plan create \
  --name myapp-plan \
  --resource-group myapp-rg \
  --sku B1 \
  --is-linux

# Create web app
az webapp create \
  --name myapp-prod \
  --resource-group myapp-rg \
  --plan myapp-plan \
  --runtime "NODE:20-lts"

# Deploy from GitHub
az webapp deployment source config \
  --name myapp-prod \
  --resource-group myapp-rg \
  --repo-url https://github.com/user/repo \
  --branch main \
  --manual-integration
```
### Staging Slots

```bash
# Create staging slot
az webapp deployment slot create \
  --name myapp-prod \
  --resource-group myapp-rg \
  --slot staging

# Deploy to staging
az webapp deployment source config \
  --name myapp-prod \
  --resource-group myapp-rg \
  --slot staging \
  --repo-url https://github.com/user/repo \
  --branch develop

# Swap staging to production
az webapp deployment slot swap \
  --name myapp-prod \
  --resource-group myapp-rg \
  --slot staging \
  --target-slot production
```
### Environment Variables

```bash
# Set app settings (DATABASE_URL resolves from Key Vault via a reference)
az webapp config appsettings set \
  --name myapp-prod \
  --resource-group myapp-rg \
  --settings \
    DATABASE_URL="@Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/db-url)" \
    NODE_ENV="production"
```
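App settings can reference Key Vault secrets via the `@Microsoft.KeyVault(SecretUri=...)` syntax used above. A small sketch for building those reference strings (the helper is hypothetical; the reference format itself is Azure's):

```typescript
// Build an App Service Key Vault reference string.
// vaultName/secretName are caller-supplied; the helper name is hypothetical.
function keyVaultRef(vaultName: string, secretName: string, version?: string): string {
  const uri =
    `https://${vaultName}.vault.azure.net/secrets/${secretName}` +
    (version ? `/${version}` : "");
  return `@Microsoft.KeyVault(SecretUri=${uri})`;
}

console.log(keyVaultRef("myvault", "db-url"));
// → "@Microsoft.KeyVault(SecretUri=https://myvault.vault.azure.net/secrets/db-url)"
```

Note that the web app's managed identity must be granted read access to the vault's secrets for these references to resolve at runtime.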
## Part 2: Azure Functions
### HTTP Trigger (TypeScript)

```typescript
// src/functions/hello.ts
import { app, HttpRequest, HttpResponseInit, InvocationContext } from "@azure/functions";

export async function hello(
  request: HttpRequest,
  context: InvocationContext
): Promise<HttpResponseInit> {
  context.log("HTTP trigger function processed a request.");
  const name = request.query.get("name") || (await request.text()) || "World";
  return {
    status: 200,
    jsonBody: { message: `Hello, ${name}!` },
  };
}

app.http("hello", {
  methods: ["GET", "POST"],
  authLevel: "anonymous",
  handler: hello,
});
```
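Because the handler is a plain async function, its logic can be exercised locally without the Functions runtime by stubbing the request and context. The stubs below are hypothetical shims, not `@azure/functions` types:

```typescript
// Stand-in for the hello handler's logic, driven with stubbed
// request/context objects instead of the real @azure/functions types.
async function hello(
  request: { query: Map<string, string>; text(): Promise<string> },
  context: { log: (...args: unknown[]) => void }
): Promise<{ status: number; jsonBody: { message: string } }> {
  context.log("HTTP trigger function processed a request.");
  const name = request.query.get("name") || (await request.text()) || "World";
  return { status: 200, jsonBody: { message: `Hello, ${name}!` } };
}

// Drive it with a stubbed request:
hello(
  { query: new Map([["name", "Azure"]]), text: async () => "" },
  { log: () => {} }
).then((res) => console.log(res.jsonBody.message)); // → "Hello, Azure!"
```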
### Queue Trigger

```typescript
// src/functions/processQueue.ts
import { app, InvocationContext } from "@azure/functions";

interface QueueMessage {
  userId: string;
  action: string;
}

export async function processQueue(
  message: QueueMessage,
  context: InvocationContext
): Promise<void> {
  context.log("Queue trigger processed:", message);
  // Process the message (handleUserAction is application code)
  await handleUserAction(message.userId, message.action);
}

app.storageQueue("processQueue", {
  queueName: "user-actions",
  connection: "AzureWebJobsStorage",
  handler: processQueue,
});
```
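The Functions runtime decodes queue messages before invoking the handler; by default it expects base64-encoded message bodies, so producers typically enqueue base64-encoded JSON for the trigger to receive a plain object. A sketch of the encoding step (assumes Node's `Buffer`; the actual enqueue call via the storage SDK is omitted):

```typescript
// JSON-serialize, then base64-encode, matching the Functions queue
// trigger's default message encoding.
interface QueueMessage {
  userId: string;
  action: string;
}

function encodeQueueMessage(msg: QueueMessage): string {
  return Buffer.from(JSON.stringify(msg)).toString("base64");
}

const encoded = encodeQueueMessage({ userId: "u1", action: "signup" });
console.log(Buffer.from(encoded, "base64").toString()); // round-trips to the JSON
```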
### Timer Trigger (Cron)

```typescript
// src/functions/dailyCleanup.ts
import { app, InvocationContext, Timer } from "@azure/functions";

export async function dailyCleanup(
  timer: Timer,
  context: InvocationContext
): Promise<void> {
  context.log("Timer trigger executed at:", new Date().toISOString());
  if (timer.isPastDue) {
    context.log("Timer is past due!");
  }
  await performCleanup(); // application code
}

app.timer("dailyCleanup", {
  // Every day at 2 AM UTC
  schedule: "0 0 2 * * *",
  handler: dailyCleanup,
});
```
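Azure Functions timer schedules use six-field NCRONTAB expressions ({second} {minute} {hour} {day} {month} {day-of-week}), not classic five-field cron, which is a common source of deployment errors. A quick sanity-check sketch (the validator is illustrative, not part of the SDK):

```typescript
// NCRONTAB expressions for Azure Functions timers have six fields,
// seconds first: "0 0 2 * * *" = every day at 02:00:00 UTC.
function isSixFieldCron(expr: string): boolean {
  return expr.trim().split(/\s+/).length === 6;
}

console.log(isSixFieldCron("0 0 2 * * *")); // → true  (valid for Functions)
console.log(isSixFieldCron("0 2 * * *"));   // → false (five-field classic cron)
```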
### Deploy Functions

```bash
# Create Function App
az functionapp create \
  --name myapp-functions \
  --resource-group myapp-rg \
  --consumption-plan-location eastus \
  --runtime node \
  --runtime-version 20 \
  --functions-version 4 \
  --storage-account mystorageaccount

# Deploy (requires Azure Functions Core Tools)
func azure functionapp publish myapp-functions
```
## Part 3: Azure Container Instances
### Quick Container Deployment

```bash
# Deploy container
az container create \
  --name myapp-container \
  --resource-group myapp-rg \
  --image myregistry.azurecr.io/myapp:latest \
  --cpu 1 \
  --memory 1.5 \
  --ports 80 \
  --environment-variables \
    DATABASE_URL="..." \
    NODE_ENV="production"

# Get public IP
az container show \
  --name myapp-container \
  --resource-group myapp-rg \
  --query ipAddress.ip \
  --output tsv
```
### With Azure Container Registry

```bash
# Create registry
az acr create \
  --name myregistry \
  --resource-group myapp-rg \
  --sku Basic

# Login to registry
az acr login --name myregistry

# Build and push
docker build -t myregistry.azurecr.io/myapp:latest .
docker push myregistry.azurecr.io/myapp:latest

# Deploy with registry credentials (quote the JMESPath query so the
# shell does not glob the brackets)
az container create \
  --name myapp-container \
  --resource-group myapp-rg \
  --image myregistry.azurecr.io/myapp:latest \
  --registry-login-server myregistry.azurecr.io \
  --registry-username "$(az acr credential show -n myregistry --query username -o tsv)" \
  --registry-password "$(az acr credential show -n myregistry --query "passwords[0].value" -o tsv)"
```
### ACI vs AKS Decision

| Use ACI | Use AKS |
|---|---|
| Simple, single containers | Complex multi-container apps |
| Quick deployments | Need orchestration |
| Dev/test environments | Production at scale |
| Batch jobs | Long-running services |
## Part 4: Azure OpenAI Service
### Setup

```bash
# Create Azure OpenAI resource
az cognitiveservices account create \
  --name myapp-openai \
  --resource-group myapp-rg \
  --kind OpenAI \
  --sku S0 \
  --location eastus

# Deploy a model
az cognitiveservices account deployment create \
  --name myapp-openai \
  --resource-group myapp-rg \
  --deployment-name gpt-4o \
  --model-name gpt-4o \
  --model-version "2024-08-06" \
  --model-format OpenAI \
  --sku-capacity 10 \
  --sku-name Standard
```
### Chat Completions (Python)

```python
import os

from openai import AzureOpenAI

client = AzureOpenAI(
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-08-01-preview",
    azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
)

response = client.chat.completions.create(
    model="gpt-4o",  # deployment name
    messages=[
        {"role": "system", "content": "You are a helpful assistant."},
        {"role": "user", "content": "What is the capital of France?"},
    ],
    temperature=0.7,
    max_tokens=500,
)

print(response.choices[0].message.content)
```
### Chat Completions (TypeScript)

```typescript
import { AzureOpenAI } from "openai";

const client = new AzureOpenAI({
  apiKey: process.env.AZURE_OPENAI_API_KEY,
  apiVersion: "2024-08-01-preview",
  endpoint: process.env.AZURE_OPENAI_ENDPOINT,
});

async function chat(prompt: string): Promise<string> {
  const response = await client.chat.completions.create({
    model: "gpt-4o", // deployment name
    messages: [
      { role: "system", content: "You are a helpful assistant." },
      { role: "user", content: prompt },
    ],
    temperature: 0.7,
    max_tokens: 500,
  });
  return response.choices[0].message.content ?? "";
}
```
### Embeddings

```python
response = client.embeddings.create(
    model="text-embedding-3-small",  # deployment name
    input="The quick brown fox jumps over the lazy dog",
)

embedding = response.data[0].embedding
print(f"Embedding dimension: {len(embedding)}")
```
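Embedding vectors are typically compared with cosine similarity. A self-contained sketch of the standard formula (in TypeScript for consistency with the other app-code examples; not tied to any Azure SDK):

```typescript
// Cosine similarity between two equal-length embedding vectors:
// dot(a, b) / (|a| * |b|), in [-1, 1]; 1 means identical direction.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

console.log(cosineSimilarity([1, 0], [1, 0])); // → 1
console.log(cosineSimilarity([1, 0], [0, 1])); // → 0
```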
### Content Filtering

Azure OpenAI applies built-in content filtering to prompts and completions.