Data Apps

DataApp Overview

The Reasoning Flows component for orchestrating complete, data-driven applications.


🧭 Purpose

The DataApp object represents a complete data-driven application within Reasoning Flows.
It acts as the central orchestration point where related components — such as data preparation tools, notebooks, machine learning models, APIs, and visualizations — are logically grouped into a unified structure.

By creating a DataApp, teams can structure projects into cohesive, deployable units, simplifying management, navigation, and execution across the Reasoning Flows ecosystem.


🔹 Where It Fits in Reasoning Flows

Within the Reasoning Flows architecture:

  1. Extract & Load → Ingests and structures data.
  2. Transform & Prepare → Cleans and formats data for analytics.
  3. AI & Machine Learning → Builds and trains predictive or generative models.
  4. Visual Objects → Delivers insights through notebooks, dashboards, or APIs.
  5. Data Models (Knowledge Nodes) → Define logical entities and metadata relationships.
  6. DataApp → Brings all of these components together into a single, operational application.

Goal: The DataApp is the execution and delivery layer — turning Reasoning Flows projects into deployable, interactive applications.
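To make this concrete, here is a minimal Python sketch of how the layers above might be grouped under a single DataApp. The class names, fields, and component names are illustrative assumptions for this documentation, not the actual Reasoning Flows SDK.

```python
# Illustrative sketch only: these classes and names are assumptions,
# not the real Reasoning Flows SDK.
from dataclasses import dataclass, field


@dataclass
class Component:
    name: str                  # e.g. "ingest_orders" or "train_forecast"
    layer: str                 # one of the six layers listed above
    depends_on: list[str] = field(default_factory=list)


@dataclass
class DataApp:
    name: str
    components: list[Component] = field(default_factory=list)

    def add(self, component: Component) -> None:
        self.components.append(component)


# A forecasting application assembled from the layers above (hypothetical names).
app = DataApp(name="SalesForecasting_App")
app.add(Component("ingest_orders", layer="Extract & Load"))
app.add(Component("clean_orders", layer="Transform & Prepare", depends_on=["ingest_orders"]))
app.add(Component("train_forecast", layer="AI & Machine Learning", depends_on=["clean_orders"]))
app.add(Component("forecast_api", layer="Visual Objects", depends_on=["train_forecast"]))
```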


⚙️ Key Features

  • 🧩 Application Context
    Defines a complete data application inside Reasoning Flows, combining ETL pipelines, ML models, APIs, and visual assets.

  • 🧠 Logical Grouping
    Groups related development objects into a single project scope — enabling consistent orchestration and versioning.

  • ⚙️ Workflow Management
    Serves as a control hub for scheduling, dependency tracking, and execution order (see the ordering sketch after this list).

  • 🚀 Deployment Integration
    Can be included in deployment processes to define the boundaries and logic of an entire application.
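The Workflow Management feature above hinges on dependency tracking and execution order. As a generic illustration of that idea (not the platform's own scheduler API), Python's standard-library graphlib can derive a valid run order from component dependencies; the component names are assumptions carried over from the earlier sketch.

```python
# Generic illustration of execution ordering; component names are assumptions.
from graphlib import TopologicalSorter

# Map each component to the set of components it depends on.
dependencies = {
    "clean_orders": {"ingest_orders"},
    "train_forecast": {"clean_orders"},
    "forecast_api": {"train_forecast"},
    "sales_dashboard": {"train_forecast"},
}

# static_order() yields components in an order that respects every dependency.
run_order = list(TopologicalSorter(dependencies).static_order())
print(run_order)
# One valid order, e.g.:
# ['ingest_orders', 'clean_orders', 'train_forecast', 'forecast_api', 'sales_dashboard']
```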


💡 Recommended Use Cases

  • Building and deploying a data product or AI-powered service (e.g., forecasting app, semantic search API).
  • Structuring multi-component solutions that involve data pipelines, model training, and inference endpoints.
  • Managing application-level dependencies across diverse Reasoning Flows objects.
  • Creating a logical hierarchy for reusable assets within enterprise-scale environments.

🖼️ Visual Example

[Diagram: DataApp Overview]

Example: A DataApp connecting ETL pipelines, prepared tables, AutoML models, and API endpoints into a single deployable unit.


🧠 Best Practices

  • Use DataApp as the top-level structure for projects — each major product or workflow should have its own.
  • Keep all related objects (ETL, ML, visualization, API) grouped under one DataApp for traceability.
  • Name DataApps clearly using business context (e.g., SalesForecasting_App, Customer360_App).
  • Use Knowledge Nodes to document linked data models and dependencies.
  • Integrate Notification Engines or Webhooks to automate reports, monitoring, and event triggers (see the sketch after this list).
  • Register the DataApp and its outcomes in the Knowledge Atlas for organization-wide visibility.
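As a minimal sketch of the webhook practice above, the snippet below posts a run-status event to a notification endpoint using the requests library. The endpoint URL, payload fields, and trigger point are assumptions for illustration; adapt them to your own Notification Engine or webhook configuration.

```python
# Hypothetical webhook notification; the URL and payload shape are assumptions.
import requests


def notify_run_finished(app_name: str, status: str) -> None:
    """Send a simple run-status event for a DataApp to a webhook endpoint."""
    payload = {"dataapp": app_name, "status": status}
    # Replace with the URL of your own Notification Engine or webhook.
    response = requests.post(
        "https://hooks.example.com/dataapp-events",
        json=payload,
        timeout=10,
    )
    response.raise_for_status()


# Example: call this at the end of a scheduled run.
# notify_run_finished("SalesForecasting_App", "succeeded")
```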

🔗 Related Documentation