Canso - ML Platform

Deployment


Last updated 11 months ago


The final step is to deploy the registered feature for execution on your cluster.

Raw Batch Feature Deployment

  • First, deploy the registered raw feature; refer to this piece of code.

  • Once deployed, you can view the logs on your data-plane Airflow web UI.
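The register-then-deploy lifecycle described above can be sketched conceptually. Note that the class and method names below (`RawBatchFeature`, `FeatureRegistry`, `register`, `deploy`) are invented for illustration and are not the actual Canso Python client API:

```python
from dataclasses import dataclass
from enum import Enum


class Status(Enum):
    REGISTERED = "registered"
    DEPLOYED = "deployed"


@dataclass
class RawBatchFeature:
    # Toy stand-in for a registered raw batch feature definition.
    name: str
    status: Status = Status.REGISTERED


class FeatureRegistry:
    """Toy registry modeling the register -> deploy lifecycle."""

    def __init__(self):
        self._features = {}

    def register(self, feature):
        self._features[feature.name] = feature

    def deploy(self, name):
        # Deploying a feature that was never registered raises KeyError:
        # registration must always precede deployment.
        feature = self._features[name]
        feature.status = Status.DEPLOYED
        return feature.status


registry = FeatureRegistry()
registry.register(RawBatchFeature("txn_amount_7d_avg"))
print(registry.deploy("txn_amount_7d_avg").value)  # deployed
```

The point of the sketch is the ordering constraint: a feature must be registered before it can be deployed, which is why the earlier registry and dry-run steps come first in this guide.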

Derived Batch Feature Deployment

  • Next, deploy the derived feature on top of the two deployed raw features. The system verifies that each referenced raw feature has been deployed successfully: if both raw features are deployed, the derived feature deployment proceeds; otherwise, it fails.
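The dependency check described above can be modeled in plain Python. This is a conceptual sketch, not the Canso client itself; the function and feature names are assumptions:

```python
def deploy_derived(derived_name, raw_dependencies, deployed_raw_features):
    """Deploy a derived feature only if every referenced raw feature
    has already been deployed successfully; otherwise fail."""
    missing = [raw for raw in raw_dependencies if raw not in deployed_raw_features]
    if missing:
        raise RuntimeError(
            f"Cannot deploy {derived_name!r}: raw features not deployed: {missing}"
        )
    return f"{derived_name} deployed"


# Both referenced raw features are deployed, so the derived feature proceeds.
print(deploy_derived("txn_risk_score",
                     ["txn_amount", "txn_count"],
                     {"txn_amount", "txn_count"}))  # txn_risk_score deployed
```

If either raw feature is missing from the deployed set, the call fails instead of silently deploying a derived feature with unmet dependencies.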

Streaming Feature Deployment

  • You can trace the logs in your data-plane Argo CD UI.

Notes

  • For explanation purposes, the feature scripts in this guide are split across several markdown files. In practice, the standard approach is a single script that defines the data source, data sink, and feature objects, then registers and deploys them all at once.

  • You can view logs through the UI (Airflow for batch features, Argo CD for streaming features) or from the terminal using `kubectl logs -f <pod-name> -n <namespace>`. The namespace should be the one where the pods running your feature DAGs are located.
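For example, to find the right pod before tailing its logs (the namespace name below is a placeholder; substitute the namespace of your own data plane):

```shell
# List pods in the namespace where your feature DAG tasks run
# ("canso-data-plane" is a placeholder namespace, not a Canso default)
kubectl get pods -n canso-data-plane

# Follow the logs of a specific task pod
kubectl logs -f <pod-name> -n canso-data-plane
```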

To deploy the streaming feature, refer to this piece of code.

[Image: Raw Feature deployment in Airflow]
[Image: Derived Feature deployment in Airflow]
[Image: Streaming Feature deployment in Argo CD]