News & Intel

Live

Daily AI-curated intelligence on Palantir Foundry, Ontology, AIP, Apollo, contracts, and community feedback. Updated automatically via GitHub Actions every day at 7 AM UTC.
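
A scheduled trigger like the one described (every day at 7 AM UTC) is typically expressed as a cron entry in a GitHub Actions workflow. The snippet below is an illustrative sketch only; the workflow name and steps are placeholders, not this site's actual configuration:

```yaml
# Illustrative sketch: names and steps are placeholders, not this site's real workflow.
name: daily-news-refresh
on:
  schedule:
    - cron: "0 7 * * *"   # every day at 07:00 UTC (GitHub Actions cron runs in UTC)
jobs:
  refresh:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      # ...curation and publish steps would go here...
```
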

70 Total Articles · 248 Contract Awards · 89 Community Feedback · 5 Daily Briefings
ONTOLOGY

pbi-ontology-extractor added to PyPI

Extract semantic intelligence from Power BI .pbix files and convert to formal ontologies

PyPI.org · Feb 25, 2026
FOUNDRY · ONTOLOGY

Module Not Found: @foundry/functions-api in TypeScript v2 Repository

Issue Description: I am following the “Building Your First Ontology Function” speedrun. In a new TypeScript v2 repository, the environment is failing to resolve @foundry/functions-api and @foundry/ontology-api. Error Message: Cannot find module '@foundry/functions-api' or its corresponding type declarations. Details & Context: Environment: Code Repositories, TypeScript v2. Steps Taken: Imported Ontology resources (Clinic, Financial) via the Resource Imports side panel. Restarted Code Assist and refreshed the browser multiple times. Verified functions.json has enableResourceGeneration: true. Observations: Even after adding resources, the “Generating Types” progress bar completes, but the red squiggles remain on the @foundry imports. The environment appears unable to “materialize” the virtual packages required for the logic. Desired Outcome: Seeking guidance on forcing the TypeScript v2 la

Palantir Community — Latest · Feb 24, 2026
FOUNDRY · ONTOLOGY

Object type backing dataset

In Ontology Manager I have created an object type with a new backing dataset. I have created an action with which I created multiple objects. In Object Explorer I find all the objects created, but I am not finding them in the dataset. Can you please help me understand why? What needs to be done to see the data in the backing dataset? Thank you!! 2 posts - 2 participants

Palantir Community — Latest · Feb 24, 2026
ONTOLOGY

Typescript function error

Hi, I was working on a TypeScript function where I needed to import 4 object types. Initially, when I imported the objects, it showed them under the object types section. But after some time the attached error message came up and all the objects were missing. Also, I am unable to add the objects again. Is this an error on my end or something to do with the application? 2 posts - 2 participants

Palantir Community — Latest · Feb 24, 2026
ONTOLOGY

Any method to add custom icons for Ontology Object Types

Being able to add SVG to the library would be useful! 1 post - 1 participant

Palantir Community — Latest · Feb 24, 2026
FOUNDRY · ONTOLOGY · RELEASE

Developer Console, unscoped application & Ontology SDK docs

Per a recent announcement: can an unscoped Developer Console application automatically access and generate Ontology SDK documentation for Projects it has access to? The announcement seems to indicate yes, but I wanted to double-check with the community: “… Developer Console applications can now be unscoped, giving you full access to Developer Console features that were previously unavailable with standalone OAuth clients, including: Documentation (OSDK, Platform APIs, development) …” https://www.palantir.com/docs/foundry/announcements/#create-unscoped-developer-console-applications 1 post - 1 participant

Palantir Community — Latest · Feb 23, 2026
ONTOLOGY

Pucks Disappearing on Scheduling Gantt Chart Widget

Hello, we’re seeing issues with pucks disappearing when dragging and dropping them onto the scheduling Gantt widget. We’ve gone through every post on this community, as well as the documentation, to ensure our Ontology and the widget are configured correctly. Is anyone else facing this issue? Thanks! Alexa 1 post - 1 participant

Palantir Community — Latest · Feb 23, 2026
FOUNDRY · ONTOLOGY · RELEASE

Feature request: Add MCP tools for managing Object Type Groups

The Palantir MCP server currently has no tools for creating, updating, or assigning Object Type Groups (type groups). Groups are a core organizational primitive in Ontology Manager, but they can only be managed through the UI — there is no programmatic access via MCP, API, or SDK. Requested tools:

- list_object_type_groups — List all groups in an ontology
- create_object_type_group — Create a new group with displayName and description
- assign_object_type_to_group — Add an object type to one or more groups
- remove_object_type_from_group — Remove an object type from a group

Alternatively, a typeGroups field on create_or_update_foundry_object_type would also work. 1 post - 1 participant

Palantir Community — Latest · Feb 23, 2026
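
The tools requested in the post above can be sketched as descriptor objects. The tool names and one-line descriptions come from the post itself; the parameter shapes below are illustrative guesses, not a real Palantir MCP API:

```typescript
// Hypothetical MCP tool descriptors for the feature request above.
// Names/descriptions are from the post; parameter shapes are assumptions.
interface ToolDescriptor {
  name: string;
  description: string;
  parameters: Record<string, string>; // parameter name -> type hint
}

const requestedTools: ToolDescriptor[] = [
  {
    name: "list_object_type_groups",
    description: "List all groups in an ontology",
    parameters: { ontologyRid: "string" },
  },
  {
    name: "create_object_type_group",
    description: "Create a new group with displayName and description",
    parameters: { displayName: "string", description: "string" },
  },
  {
    name: "assign_object_type_to_group",
    description: "Add an object type to one or more groups",
    parameters: { objectType: "string", groups: "string[]" },
  },
  {
    name: "remove_object_type_from_group",
    description: "Remove an object type from a group",
    parameters: { objectType: "string", group: "string" },
  },
];

console.log(requestedTools.map((t) => t.name));
```

As the post notes, a typeGroups field on the existing create_or_update_foundry_object_type tool would fold the same information into one call instead of four.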
FOUNDRY · ONTOLOGY

Ontology Manager Many to Many Link

Hi, I am building a many-to-many link between two objects. I am using a dataset as the join for this approach. Has anyone used an object type as the join? Does it work? Can you please provide me with detailed steps? Thank you! 2 posts - 2 participants

Palantir Community — Latest · Feb 23, 2026
ONTOLOGY

Group by and Count in the OSDK are not functioning correctly

I have an ontology object with 158 rows and a single value for source_file_name. When I execute this query: "groupBy": [ { "type": "exact", "field": "source_file_name" } ], "aggregation": [ { "type": "count", "name": "count" } ] I get this: {"excludedItems":0,"accuracy":"ACCURATE","data":[{"group":{"source_file_name":"civitas_el_download_20251217_0001.csv"},"metrics":[{"name":"count","value":158.0}]}]} I should get a count of 1. And when I run this query: "aggregation": [ { "type": "count", "name": "count" } ] I get: {"accuracy":"ACCURATE","data":[{"group":{},"metrics":[{"name":"count","value":158.0}]}]} which is the expected result. The only way I can get the count I am looking for is with: "aggregation": [ { "type": "exactDistinct", "name": "exactDistinctCount", "field": "source_file_name" } ], "where": { "type": "or", "value": [

Palantir Community — Latest · Feb 23, 2026
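
The behavior in the post above is arguably working as designed: an exact groupBy with a count aggregation returns the number of rows per group, while exactDistinct returns the number of distinct values. A plain-TypeScript sketch of the distinction, using made-up data mirroring the post's 158 rows (no Foundry API involved):

```typescript
// 158 rows, all sharing one source_file_name value, as in the post.
const rows = Array.from({ length: 158 }, () => ({
  source_file_name: "civitas_el_download_20251217_0001.csv",
}));

// groupBy "exact" + count aggregation: rows per group.
// One group exists, and that group contains 158 rows -> count is 158.
const counts = new Map<string, number>();
for (const r of rows) {
  counts.set(r.source_file_name, (counts.get(r.source_file_name) ?? 0) + 1);
}

// exactDistinct aggregation: number of distinct values -> 1.
const distinct = new Set(rows.map((r) => r.source_file_name)).size;

console.log(counts.get("civitas_el_download_20251217_0001.csv")); // 158
console.log(distinct); // 1
```

So the 158 the poster sees is the per-group row count, and exactDistinct is the right aggregation for "how many distinct file names exist".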
FOUNDRY · ONTOLOGY

Speedrun: Mining Your First Business Process | Failed load of logs

While following the “Mining Your First Business Process” Speedrun, I am encountering a blocking error at the Log Object Type installation step (Step 2). Although the Process Ontology step completes successfully, the application fails to load the event log configuration despite the dataset existing in the project folder. { "name": "Error", "message": "Failed to load log product from Marketplace", "stack": "Error: Failed to load log product from Marketplace\n at https://jorgeochoa.euw-3.palantirfoundry.co.uk/assets/content-addressable-storage/frontend/915eba5c689d273151ae0f31632f1e5577e3481ad1df7c4f91ce7ba63aceae5e.js:5:66104\n at async https://jorgeochoa.euw-3.palantirfoundry.co.uk/assets/content-addressable-storage/frontend/915eba5c689d273151ae0f31632f1e5577e3481ad1df7c4f91ce7ba63aceae5e.js:27:440819" } 2 posts - 2 participants

Palantir Community — Latest · Feb 23, 2026
ONTOLOGY

Can Solutions Designer map out an already created ontology / solution design?

If not, I think this would be very useful! You could guide it with your main objects and let it traverse. 1 post - 1 participant

Palantir Community — Latest · Feb 23, 2026
FOUNDRY · ONTOLOGY · AIP

OSDK security does not work with LLM proxies

Below is a client I created to work with LLM proxies in Foundry. I get 403 Forbidden when creating an access token with my OSDK client. Is this a known issue? Are personal access tokens required for LLM proxies? If so, can you please fix this? import { SupportedFoundryClients, type OpenAIService, } from '@codestrap/developer-foundations-types'; import OpenAI from 'openai'; import { foundryClientFactory } from '../factory/foundryClientFactory'; import type { ChatCompletionCreateParamsStreaming } from 'openai/resources/chat'; import type { RequestOptions } from 'openai/core'; import type { ResponseCreateParamsStreaming } from 'openai/resources/responses/responses'; // Add type definitions for the OpenAI response here, or in a separate file and import them in, to ensure type safety when working with the API response data. export function makeOpenAIService(): OpenAIService { const { getToken, url, ontologyRid } = foundryClientFactory( process.env.FOUNDRY_CLIENT_TYPE || S

Palantir Community — Latest · Feb 23, 2026
FOUNDRY · ONTOLOGY · AIP

What models are supported with LLM proxies?

When calling OpenAI LLM proxies, the only models I can get to work so far are the GPT-4 series, i.e. gpt-4.1, gpt-4.1-mini, etc. Is the GPT-5 series supported? How am I supposed to know which models are supported? import { SupportedFoundryClients, type OpenAIService, } from '@codestrap/developer-foundations-types'; import OpenAI from 'openai'; import { foundryClientFactory } from '../factory/foundryClientFactory'; import type { ChatCompletionCreateParamsStreaming } from 'openai/resources/chat'; import type { RequestOptions } from 'openai/core'; import type { ResponseCreateParamsStreaming } from 'openai/resources/responses/responses'; // Add type definitions for the OpenAI response here, or in a separate file and import them in, to ensure type safety when working with the API response data. export function makeOpenAIService(): OpenAIService { const { getToken, url, ontologyRid } = foundryClientFactory( process.env.FOUNDRY_CLIENT_TYPE || SupportedFoundryClients.PRIVATE, unde

Palantir Community — Latest · Feb 23, 2026
ONTOLOGY

Palantir's secret weapon isn't AI – it's Ontology. An open-source deep dive

HN Discussion — 47 comments · 76 points.

Hacker News · Feb 23, 2026
RELEASE · ONTOLOGY

[palantir/conjure-typescript] 5.12.0

Palantir's `conjure-typescript` library released version 5.12.0. This update primarily fixes an omission by adding the essential 'type' field to all generated `package.json` files.

Palantir GitHub — palantir/conjure-typescript · Feb 22, 2026