DataApp Overview
The Reasoning Flows component for orchestrating complete, data-driven applications.
🧭 Purpose
The DataApp object represents a complete data-driven application within Reasoning Flows.
It acts as the central orchestration point where related components — such as data preparation tools, notebooks, machine learning models, APIs, and visualizations — are logically grouped into a unified structure.
By creating a DataApp, teams can structure projects into cohesive, deployable units, simplifying management, navigation, and execution across the Reasoning Flows ecosystem.
🔹 Where It Fits in Reasoning Flows
Within the Reasoning Flows architecture:
- Extract & Load → Ingests and structures data.
- Transform & Prepare → Cleans and formats data for analytics.
- AI & Machine Learning → Builds and trains predictive or generative models.
- Visual Objects → Delivers insights through notebooks, dashboards, or APIs.
- Data Models (Knowledge Nodes) → Define logical entities and metadata relationships.
- DataApp → Brings all of these components together into a single, operational application.
Goal: The DataApp is the execution and delivery layer — turning Reasoning Flows projects into deployable, interactive applications.
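To make the layering concrete, here is a minimal, hypothetical Python sketch (not the actual Reasoning Flows API) of a DataApp acting as a container for components drawn from each layer. All class, layer, and component names are illustrative assumptions.

```python
# Illustrative sketch only, not the Reasoning Flows API.
# Models how a DataApp conceptually groups components from each layer
# into one deployable unit. All names are hypothetical.
from dataclasses import dataclass, field
from typing import List


@dataclass
class Component:
    name: str
    layer: str  # e.g. "extract_load", "transform_prepare", "ml", "visual"


@dataclass
class DataApp:
    name: str
    components: List[Component] = field(default_factory=list)

    def add(self, component: Component) -> None:
        self.components.append(component)

    def by_layer(self, layer: str) -> List[Component]:
        return [c for c in self.components if c.layer == layer]


app = DataApp(name="SalesForecasting_App")
app.add(Component("raw_orders_ingest", "extract_load"))
app.add(Component("orders_cleaning", "transform_prepare"))
app.add(Component("demand_forecast_model", "ml"))
app.add(Component("forecast_dashboard", "visual"))

print([c.name for c in app.by_layer("ml")])  # ['demand_forecast_model']
```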
⚙️ Key Features
- 🧩 Application Context: Defines a complete data application inside Reasoning Flows, combining ETL pipelines, ML models, APIs, and visual assets.
- 🧠 Logical Grouping: Groups related development objects into a single project scope, enabling consistent orchestration and versioning.
- ⚙️ Workflow Management: Serves as a control hub for scheduling, dependency tracking, and execution order (see the dependency-ordering sketch after this list).
- 🚀 Deployment Integration: Can be included in deployment processes to define the boundaries and logic of an entire application.
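As a rough illustration of the "control hub" idea, the sketch below uses Python's standard graphlib to derive a valid execution order from component dependencies. The dependency graph and component names are hypothetical, and the actual scheduling logic inside Reasoning Flows may differ.

```python
# Illustrative only: dependency-driven execution ordering, the kind of
# orchestration a DataApp performs as a control hub. The component names
# and the dependency graph are hypothetical.
from graphlib import TopologicalSorter

# Map each component to the components it depends on.
dependencies = {
    "orders_cleaning": {"raw_orders_ingest"},
    "demand_forecast_model": {"orders_cleaning"},
    "forecast_api": {"demand_forecast_model"},
    "forecast_dashboard": {"demand_forecast_model"},
}

# static_order() yields an execution order that respects every dependency.
execution_order = list(TopologicalSorter(dependencies).static_order())
print(execution_order)
# e.g. ['raw_orders_ingest', 'orders_cleaning', 'demand_forecast_model',
#       'forecast_api', 'forecast_dashboard']
```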
💡 Recommended Use Cases
- Building and deploying a data product or AI-powered service (e.g., forecasting app, semantic search API).
- Structuring multi-component solutions that involve data pipelines, model training, and inference endpoints.
- Managing application-level dependencies across diverse Reasoning Flows objects.
- Creating a logical hierarchy for reusable assets within enterprise-scale environments.
🖼️ Visual Example

Example: A DataApp connecting ETL pipelines, prepared tables, AutoML models, and API endpoints into a single deployable unit.
🧠 Best Practices
- Use DataApp as the top-level structure for projects — each major product or workflow should have its own.
- Keep all related objects (ETL, ML, visualization, API) grouped under one DataApp for traceability.
- Name DataApps clearly using business context (e.g., SalesForecasting_App, Customer360_App); the hypothetical manifest sketch after this list illustrates these conventions.
- Use Knowledge Nodes to document linked data models and dependencies.
- Integrate Notification Engines or Webhooks to automate reports, monitoring, and event triggers.
- Register the DataApp and its outcomes in the Knowledge Atlas for organization-wide visibility.
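The sketch below shows one way these practices could be captured in an application manifest. It is an assumed, hypothetical format for illustration, not a Reasoning Flows artifact, and the naming pattern is only an example convention.

```python
# Hypothetical manifest, for illustration only; not a Reasoning Flows format.
import re

data_app_manifest = {
    "name": "Customer360_App",  # business-context name, one DataApp per product
    "knowledge_nodes": ["customer", "interaction"],  # linked data models documented
    "components": {
        "extract_load": ["crm_ingest", "web_events_ingest"],
        "transform_prepare": ["customer_dedup", "event_sessionization"],
        "ml": ["churn_model"],
        "visual": ["customer360_dashboard"],
    },
    "notifications": {"on_failure": "webhook:ops-alerts"},  # automated event triggers
}

# Assumed naming convention: <BusinessContext>_App
assert re.fullmatch(r"[A-Za-z0-9]+_App", data_app_manifest["name"])
```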
🔗 Related Documentation
- Visual Object Overview — Learn how to create notebooks and visualization layers within a DataApp.
- AI & Machine Learning Overview — Integrate trained models and inference engines into your DataApp.
- Transform & Prepare Overview — Prepare and clean data before inclusion in an application.
- Data Models (Knowledge Node) — Define logical structures and link them to your DataApp.
- Knowledge Atlas Overview — Document and connect your DataApp to organizational knowledge.