# AIP (AI Platform) — Complete Overview & Architecture
## What is AIP?
Palantir AIP (AI Platform) is the AI layer built on top of Foundry and the Ontology. It enables organizations to deploy production-grade AI agents, copilots, and AI-powered workflows that act on real enterprise data — all with built-in governance, auditability, and security.
AIP was announced in 2023 and has since become one of Palantir's fastest-growing products, a major driver of its commercial revenue.
## AIP Architecture Components
### 1. AIP Logic
AIP Logic is a serverless function environment where TypeScript functions call LLMs and interact with the Ontology. Functions can:
- Retrieve Ontology objects as context for LLM prompts
- Apply Ontology Actions to update business data
- Chain multiple LLM calls in a pipeline
- Return structured outputs consumed by Workshop, Copilot, or external APIs
```typescript
// AIP Logic Function Example
// Note: the module names and API shape below are illustrative; exact SDK
// imports and signatures may differ by Foundry version.
import { Function, Context } from '@osdk/aip';
import { ThreatAssessment } from './ontology-objects';

export const analyzeThreatAndSuggestAction = Function({
  name: 'analyze-threat',
  description: 'Analyzes a threat assessment and suggests a mission response',
  parameters: {
    threatId: { type: 'string', description: 'Threat assessment object ID' },
  },
  async execute({ threatId }, context: Context) {
    // 1. Fetch Ontology object — real, live data
    const threat: ThreatAssessment = await context.ontology.ThreatAssessment.get(threatId);

    // 2. Call LLM with grounded context
    const analysis = await context.llm.complete({
      model: 'gpt-4o',
      system: 'You are a military planning assistant. Analyze the threat and recommend action.',
      user: `Threat Data: ${JSON.stringify(threat)}
Provide: severity rating, recommended response, estimated timeline.`,
      outputFormat: { severity: 'string', response: 'string', timeline: 'string' },
    });

    // 3. Apply an Action to record the analysis
    await context.ontology.actions.recordThreatAnalysis({
      threat,
      aiAnalysis: analysis.response,
      severity: analysis.severity,
    });

    return analysis;
  },
});
```
### 2. AIP Copilot
Copilots are AI assistants embedded into Workshop applications. They use AIP Logic functions as tools, allowing users to ask natural-language questions and trigger business operations.
**Copilot Anatomy:**
- Trigger: User types a message in a Workshop Copilot widget
- Intent Detection: LLM identifies which AIP Logic function to invoke
- Function Call: AIP executes the function with extracted parameters
- Response: Structured output formatted as a natural-language response
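The four steps above can be sketched as a small dispatch loop in plain TypeScript. This is a simplified illustration, not the AIP Copilot API: `ToolDefinition`, `detectIntent`, and `handleCopilotMessage` are hypothetical names, and the LLM-based intent detection is stubbed with a regex.

```typescript
type ToolDefinition = {
  name: string;
  description: string;
  execute: (params: Record<string, string>) => Promise<object>;
};

// Registry of AIP Logic functions exposed to the Copilot as callable tools
const tools: ToolDefinition[] = [
  {
    name: 'analyze-threat',
    description: 'Analyzes a threat assessment and suggests a response',
    execute: async (params) => ({ severity: 'high', threatId: params.threatId }),
  },
];

// Step 2: intent detection. In AIP this is an LLM call; stubbed here with a regex.
function detectIntent(message: string): { tool: string; params: Record<string, string> } {
  const match = message.match(/threat\s+(\S+)/i);
  return { tool: 'analyze-threat', params: { threatId: match ? match[1] : 'unknown' } };
}

// Steps 3-4: invoke the matched function, then format a reply for the user
async function handleCopilotMessage(message: string): Promise<string> {
  const intent = detectIntent(message);
  const tool = tools.find((t) => t.name === intent.tool);
  if (!tool) return 'No matching function found.';
  const result = await tool.execute(intent.params);
  return `Ran ${tool.name}: ${JSON.stringify(result)}`;
}
```

In the real platform the registry, parameter extraction, and response formatting are all handled by AIP; the sketch only shows the control flow.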
### 3. AIP Studio
AIP Studio is a visual agent builder — drag and drop LLM nodes, data retrieval steps, actions, and conditional branches to create multi-step AI workflows without code.
**Studio Node Types:**
- LLM Node: Call any configured LLM model
- Ontology Query Node: Retrieve objects by filter
- Action Node: Apply an Ontology action
- Branch Node: Conditional logic based on LLM output
- Loop Node: Iterate over a list of objects
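One way to picture these node types is as a discriminated union over a workflow graph. This is a hypothetical data model for illustration only — Studio's internal representation is not public, and every name below is invented.

```typescript
// Each Studio node type from the list above becomes one variant of the union
type StudioNode =
  | { kind: 'llm'; id: string; prompt: string }
  | { kind: 'ontologyQuery'; id: string; objectType: string; filter: string }
  | { kind: 'action'; id: string; actionName: string }
  | { kind: 'branch'; id: string; condition: string; ifTrue: string; ifFalse: string }
  | { kind: 'loop'; id: string; over: string; bodyNode: string };

// A workflow is a set of nodes plus directed edges between node ids
type Workflow = { nodes: StudioNode[]; edges: [string, string][] };

// Hypothetical triage workflow: query open threats, rate them, branch on severity
const triageWorkflow: Workflow = {
  nodes: [
    { kind: 'ontologyQuery', id: 'fetch', objectType: 'ThreatAssessment', filter: 'status = open' },
    { kind: 'llm', id: 'rate', prompt: 'Rate the severity of each threat' },
    { kind: 'branch', id: 'check', condition: 'severity == high', ifTrue: 'escalate', ifFalse: 'log' },
    { kind: 'action', id: 'escalate', actionName: 'escalateThreat' },
    { kind: 'action', id: 'log', actionName: 'recordThreatAnalysis' },
  ],
  edges: [['fetch', 'rate'], ['rate', 'check']],
};
```

The branch node carries its two successor ids directly, which is why only the linear edges appear in `edges`; a real builder would pick one consistent representation.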
### 4. Function Repository
TypeScript functions that extend the Ontology — callable from AIP Logic, Workshop widgets, Copilot, and external APIs via the OSDK.
## Supported LLMs in AIP
| Provider | Models |
|---|---|
| OpenAI | GPT-4o, GPT-4 Turbo, GPT-3.5 Turbo |
| Anthropic | Claude 3.5 Sonnet, Claude 3 Opus |
| Meta | Llama 3.1 (via Azure/AWS) |
| Google | Gemini 1.5 Pro, Gemini 1.0 |
| Azure OpenAI | All GPT-4 variants |
| Palantir-hosted | Fine-tuned models in secure enclaves |
## AIP Security & Governance
Critical for enterprise adoption:
- All LLM inputs/outputs are logged and auditable
- Markings are enforced — LLMs only see data the user can access
- No training on customer data (explicit Palantir policy)
- RBAC controls which users can access which AIP functions
- PII/PHI masking can be applied before LLM calls
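As a rough illustration of the last point, here is a minimal regex-based PII scrubber that could run before text reaches a model. In AIP, masking is configured on the platform rather than hand-rolled; the patterns below are illustrative and far from exhaustive.

```typescript
// Illustrative PII patterns and replacement tokens (not AIP's actual masking)
const PII_PATTERNS: [RegExp, string][] = [
  [/\b\d{3}-\d{2}-\d{4}\b/g, '[SSN]'],              // US Social Security numbers
  [/\b[\w.+-]+@[\w-]+\.[\w.]+\b/g, '[EMAIL]'],      // email addresses
  [/\b\d{3}[-.\s]\d{3}[-.\s]\d{4}\b/g, '[PHONE]'],  // US phone numbers
];

// Replace every match of every pattern with its masking token
function maskPII(text: string): string {
  return PII_PATTERNS.reduce((acc, [pattern, token]) => acc.replace(pattern, token), text);
}
```

Pattern order matters here: the SSN pattern runs before the phone pattern so that `123-45-6789` is not partially consumed by a looser match.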
## Real-World AIP Use Cases
- Defense: Autonomous mission planning with threat assessment data from the Ontology
- Healthcare: AI-assisted clinical decision support with live patient data
- Finance: Automated regulatory compliance review of contracts and filings
- Supply Chain: Predictive disruption alerts with AI-generated mitigation plans
- Government: Benefits eligibility determination with auditability for appeals
## Getting Started with AIP Logic
```bash
# Install Palantir CLI
npm install -g @osdk/cli

# Authenticate to your Foundry stack
osdk auth login --foundry-url https://your-stack.palantirfoundry.com

# Create a new Function Repository
osdk function create my-aip-functions

# Develop locally with live Ontology access
osdk function dev
```
## AIP vs. Direct LLM API Calls
| Feature | Direct LLM API | Palantir AIP |
|---|---|---|
| Live enterprise data | ❌ Manual RAG | ✅ Automatic via Ontology |
| Audit trail | ❌ None | ✅ Full audit log |
| Access controls | ❌ None | ✅ Marking-aware |
| Action execution | ❌ Custom code | ✅ Built-in via Actions |
| Hallucination risk | ⚠️ High | ✅ Low — grounded data |
| Deployment | ❌ Custom infra | ✅ Managed by Foundry |
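To make the left column concrete, here is a sketch of the glue code a direct-API integration typically ends up hand-rolling: retrieval, access checks, and an audit trail around a (stubbed) model call. All names here are hypothetical, and the "LLM call" is a placeholder string.

```typescript
type User = { id: string; clearances: string[] };
type Doc = { id: string; marking: string; text: string };

// Manual audit trail: with a direct API, logging is the caller's problem
const auditLog: string[] = [];

// Manual RAG: a naive keyword filter combined with a hand-written marking check
function retrieveContext(user: User, docs: Doc[], query: string): Doc[] {
  return docs.filter(
    (d) =>
      user.clearances.includes(d.marking) &&
      d.text.toLowerCase().includes(query.toLowerCase())
  );
}

// Wrap every model call with retrieval + audit logging by hand
function callLLM(user: User, docs: Doc[], query: string): string {
  const context = retrieveContext(user, docs, query);
  auditLog.push(`${user.id} queried "${query}" with ${context.length} docs`);
  // Stand-in for a real provider API call with the retrieved context
  return `answer grounded in ${context.map((d) => d.id).join(',')}`;
}
```

Each row of the comparison table corresponds to a piece of this boilerplate that the platform claims to absorb: the filter is the RAG pipeline, the clearance check is marking enforcement, and `auditLog` is the audit trail.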