Once deployed, you can view the logs on your data-plane Airflow web UI.
Figure: Raw feature deployment in Airflow
Derived Batch Feature Deployment
Next, deploy the derived feature on top of the two raw features deployed above. The system verifies that the referenced raw features were deployed successfully: if both succeeded, the derived feature deployment proceeds; if either one failed, the derived feature deployment fails.
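The dependency check described above can be sketched as follows. This is an illustrative model only, not the platform's actual API; the function and field names (`can_deploy_derived`, `depends_on`, the status strings) are hypothetical.

```python
# Hypothetical sketch of the pre-deployment check for a derived feature:
# every raw feature it references must already be deployed successfully.
def can_deploy_derived(derived: dict, deployment_status: dict) -> bool:
    """Return True only if all referenced raw features deployed successfully."""
    return all(
        deployment_status.get(raw) == "SUCCESS"
        for raw in derived["depends_on"]
    )

# Illustrative state: both raw features deployed successfully.
status = {"raw_feature_a": "SUCCESS", "raw_feature_b": "SUCCESS"}
derived = {"name": "derived_feature_ab",
           "depends_on": ["raw_feature_a", "raw_feature_b"]}

print(can_deploy_derived(derived, status))   # proceeds when both succeeded

# If one raw feature failed, the derived deployment is rejected.
status["raw_feature_b"] = "FAILED"
print(can_deploy_derived(derived, status))
```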
For clarity of explanation, the feature scripts are split into multiple parts across several markdown files. In practice, however, the standard approach is to write a single script containing the data sink, data source, and feature objects, and then register and deploy them all at once.
You can view the logs in two ways: through the UI (Airflow for batch features, ArgoCD for streaming features), or from the terminal using the command kubectl logs -f <pod-name> -n <namespace>. The namespace should be the one where the pods running your feature DAGs are located.
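The terminal workflow above can be sketched as a short shell session. The namespace and pod names here (`feature-pipelines`, `raw-feature-dag-run-abc123`) are placeholders; substitute the ones from your own cluster. The sketch builds and prints the exact command you would run rather than invoking it, so it works without a live cluster.

```shell
# Placeholder values -- replace with your actual namespace and pod name.
NAMESPACE="feature-pipelines"            # namespace running your feature DAG pods
POD="raw-feature-dag-run-abc123"         # find it with: kubectl get pods -n "$NAMESPACE"

# Construct the log-following command; run it directly against your cluster.
CMD="kubectl logs -f $POD -n $NAMESPACE"
echo "$CMD"
```

Running the echoed command streams (`-f`, follow) the pod's logs until you interrupt it with Ctrl-C.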