Deployment
The final step is to deploy the registered features for execution on your cluster.
Raw Batch Feature Deployment
First, deploy the registered raw feature. Refer to the code below.
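The deployment call depends on your platform's SDK, so the snippet below is only a minimal sketch: the `FeatureStoreClient` class, its constructor arguments, and the `deploy` method are placeholders, not the platform's actual API. Substitute the client and feature name from your own registration script.

```python
# Hypothetical sketch only: client class, constructor arguments, and deploy()
# are placeholders for whatever your platform's SDK actually exposes.
from feature_store_sdk import FeatureStoreClient  # placeholder import

client = FeatureStoreClient(
    url="https://<control-plane-host>",  # control-plane endpoint (placeholder)
    token="<api-token>",                 # credentials (placeholder)
)

# Deploy the raw batch feature registered earlier; deployment schedules its
# DAG on the data-plane Airflow instance.
client.deploy(feature_name="<registered-raw-feature-name>")
```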
Once deployed, you can view the logs on your data-plane Airflow web UI.

Derived Batch Feature Deployment
Next, deploy the derived feature on top of the two deployed raw features. The system checks that the referenced raw features were deployed successfully; if both were, the derived feature deployment proceeds, otherwise it fails.
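As above, the exact call is SDK-specific. A minimal sketch, assuming the same hypothetical client and that both raw features have already been deployed, might look like this.

```python
# Hypothetical sketch: deploy a derived batch feature that references two
# previously deployed raw features. The platform is assumed to validate the
# upstream deployments and to fail this call if either raw feature is missing.
from feature_store_sdk import FeatureStoreClient  # placeholder import

client = FeatureStoreClient(url="https://<control-plane-host>", token="<api-token>")
client.deploy(feature_name="<registered-derived-feature-name>")
```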

Streaming Feature Deployment
To deploy the streaming feature, refer to the code below.
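Again assuming the hypothetical client from the earlier sketches, streaming deployment follows the same pattern; the difference is that the rollout is handled through ArgoCD on the data plane rather than Airflow.

```python
# Hypothetical sketch: deploy a registered streaming feature. The rollout is
# assumed to be managed by ArgoCD on the data plane, so progress shows up in
# the ArgoCD UI rather than the Airflow UI.
from feature_store_sdk import FeatureStoreClient  # placeholder import

client = FeatureStoreClient(url="https://<control-plane-host>", token="<api-token>")
client.deploy(feature_name="<registered-streaming-feature-name>")
```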
You can trace the logs in your data-plane ArgoCD UI.

Notes
For ease of explanation, the feature scripts are split into multiple parts across several markdown files. In practice, however, the standard approach is to create a single script containing the data source, sink, and feature objects, and then register and deploy them all at once.
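As an illustration of that practice, a single script might define the objects and then register and deploy them together. All class and object names below are placeholders, not the platform's actual API.

```python
# Hypothetical skeleton of a combined feature script: define sources, sinks,
# and features in one file, then register and deploy everything in one pass.
from feature_store_sdk import (  # placeholder imports
    FeatureStoreClient,
    BatchSource,
    BatchSink,
    Feature,
)

client = FeatureStoreClient(url="https://<control-plane-host>", token="<api-token>")

source = BatchSource(name="transactions_source")   # plus connection details
sink = BatchSink(name="features_sink")             # plus destination details
feature = Feature(name="txn_amount_raw", source=source, sink=sink)

# Register every object, then deploy the feature in the same run.
for obj in (source, sink, feature):
    client.register(obj)
client.deploy(feature_name=feature.name)
```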
You can view the logs through the UI (Airflow for batch features, ArgoCD for streaming features), or from the terminal with the command kubectl logs -f <pod-name> -n <namespace>. The namespace should be the one where the pods running your feature DAGs are located.
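For example, you can list the pods first to find the right name; the namespace and pod names below are illustrative.

```
# List the pods in the namespace that runs your feature DAGs (names are illustrative)
kubectl get pods -n feature-pipelines

# Stream logs from one of those pods
kubectl logs -f batch-feature-dag-run-abc12 -n feature-pipelines
```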