Resolving Data Pipeline Failures
Step-by-step instructions for troubleshooting and recovering from common ETL processing errors.
Steps
- Identify the Failed Task: Log in to Apache Airflow and locate the task marked in red.
- Review Worker Logs: Click the task node and select View Logs to identify the specific SQL or Python error.
- Clear and Retry: Once the underlying issue (e.g., database connectivity) is resolved, click Clear on the failed node to trigger a retry.
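The log-review and clear-and-retry steps above can also be scripted. The sketch below is a minimal Python illustration, not an official tool: `extract_error_lines` is a hypothetical helper that pulls error lines from a worker log excerpt, and `build_clear_payload` assumes the request body shape of Airflow 2.x's stable REST API endpoint `POST /api/v1/dags/{dag_id}/clearTaskInstances` (field names may vary by Airflow version).

```python
import re

def extract_error_lines(log_text, max_lines=5):
    """Return the last few ERROR/Traceback/Exception lines from a worker log."""
    hits = [ln for ln in log_text.splitlines()
            if re.search(r"ERROR|Traceback|Exception", ln)]
    return hits[-max_lines:]

def build_clear_payload(dag_run_id, task_id):
    """Request body for clearing a failed task instance so Airflow retries it.

    Assumes the Airflow 2.x stable REST API's clearTaskInstances schema;
    check your deployed version's API docs before use.
    """
    return {
        "dry_run": False,
        "task_ids": [task_id],
        "dag_run_id": dag_run_id,
        "only_failed": True,  # mirror the UI's Clear action on a failed node
    }

# Example: scan a (fabricated) log excerpt for the underlying error.
log = """[2024-01-01] INFO - starting
[2024-01-01] ERROR - psycopg2.OperationalError: could not connect to server
Traceback (most recent call last):
"""
print(extract_error_lines(log))
```

Once the root cause (e.g., restored database connectivity) is confirmed, the payload would be sent to the API with an authenticated POST request, which has the same effect as clicking Clear in the UI.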
Metadata:
- Audience: ops, developer
- Doc-Type: how-to
- Status: refactored