Listing BashOperator and PythonOperator won't impress anyone. See the production-grade answer with real DAG patterns that gets senior offers.
What are Airflow Operators? Give examples.
Airflow operators are the building blocks of DAGs. Each operator represents a single task. Common operators include BashOperator for running bash commands, PythonOperator for running Python functions, and EmailOperator for sending emails. There are also provider operators like BigQueryOperator and S3FileTransformOperator.
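For the "give examples" part, here is a minimal sketch of the two classic operators wired into a DAG (assuming Airflow 2.x; the dag_id, task_ids, and commands are illustrative):

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.operators.python import PythonOperator

def greet():
    print("hello from Python")

with DAG(
    dag_id="basic_operators",        # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule=None,                   # `schedule` needs Airflow 2.4+; older versions use schedule_interval
    catchup=False,
) as dag:
    run_bash = BashOperator(task_id="run_bash", bash_command="echo hello")
    run_python = PythonOperator(task_id="run_python", python_callable=greet)
    run_bash >> run_python  # bash task runs first
```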
Operators define a single unit of work in a DAG, and they fall into three categories: action, transfer, and sensor. The real skill is choosing the right pattern for each:
Action (TaskFlow API, Airflow 2.x):

```python
from airflow.decorators import task

# Modern TaskFlow API (Airflow 2.x) replaces PythonOperator
@task
def extract_data():
    from pyspark.sql import SparkSession  # assumes pyspark is available on the worker
    spark = SparkSession.builder.getOrCreate()
    return spark.read.parquet('/raw/events').count()
```

Transfer (system-to-system copy):

```python
from airflow.providers.google.cloud.transfers.s3_to_gcs import S3ToGCSOperator

s3_to_gcs = S3ToGCSOperator(
    task_id='s3_to_gcs',
    bucket='source-bucket',
    dest_gcs='gs://dest-bucket/',
    replace=True,  # idempotent: reruns overwrite instead of duplicating
)
```

Sensor (wait for a condition):

```python
from airflow.providers.amazon.aws.sensors.s3 import S3KeySensor

wait_for_file = S3KeySensor(
    task_id='wait_for_file',
    bucket_name='data-lake',
    bucket_key='daily/{{ ds }}/events.parquet',
    mode='reschedule',  # frees the worker slot while waiting
    poke_interval=300,
)
```
Production pitfalls to call out:
- Idempotency: write with INSERT OVERWRITE, not INSERT INTO, so retried tasks don't duplicate rows.
- Sensor modes: use mode='reschedule'; the default poke mode ties up a worker slot for hours.
- XCom limits: XCom values live in the metadata database, so pass references (paths, table names), not large payloads.
- Modern best practice: the TaskFlow API with the @task decorator plus dynamic task mapping for scalable parallel execution (sketched below).
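To make that last point concrete, here is a minimal sketch of dynamic task mapping with the TaskFlow API (assumes Airflow 2.4+; the DAG name, partition list, and processing step are placeholders):

```python
from datetime import datetime

from airflow.decorators import dag, task

@dag(schedule=None, start_date=datetime(2024, 1, 1), catchup=False)
def partition_pipeline():
    @task
    def list_partitions():
        # A real DAG might list S3 prefixes here; hardcoded for the sketch
        return ["2024-01-01", "2024-01-02", "2024-01-03"]

    @task
    def process(partition: str):
        print(f"processing {partition}")

    # Dynamic task mapping: one mapped task instance per partition, run in parallel
    process.expand(partition=list_partitions())

partition_pipeline()
```

The scheduler fans out one `process` instance per returned partition at run time, so parallelism scales with the data instead of being hardcoded into the DAG.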
Listing operator names is junior-level. Senior engineers explain the three categories (action/transfer/sensor), demonstrate modern TaskFlow API, and discuss the production pitfalls: idempotency, sensor modes, and XCom limits.
Paste your answer and get instant AI-powered feedback with a FAANG-level improved version.
Analyze My Answer — Free. 3 free analyses per day. No sign-up required.