bash - Airflow: pass parameters to a dependent task
What is the way to pass parameters between dependent tasks in Airflow? I have a lot of bash files, and I'm trying to migrate this approach to Airflow, but I don't know how to pass properties between tasks.
This is a real example:
    #sqoop bash template
    sqoop_template = """
        sqoop job --exec {{ params.job }} -- --target-dir {{ params.dir }} --outdir /src/
    """

    s3_template = """
        s3-dist-cp --src={{ params.dir }} --dest={{ params.s3 }}
    """

    #extraction task on EMR
    t1 = BashOperator(
        task_id='extract_account',
        bash_command=sqoop_template,
        params={'job': 'job',
                'dir': 'hdfs:///account/' + datetime.now().strftime("%Y-%m-%d-%H-%M-%S")},
        dag=dag)

    #upload task for the S3 backup
    t2 = BashOperator(
        task_id='s3_upload',
        bash_command=s3_template,
        params={},  # here I need the dir name created in t1
        depends_on_past=True,
        dag=dag)

    t2.set_upstream(t1)
In t2 I need to access the dir name created in t1.
Solution
    #execute a valid sqoop job
    def sqoop_import(table_name, job_name):
        s3, hdfs = dirpath(table_name)
        sqoop_job = job_default_config(job_name, hdfs)
        #call(sqoop_job)
        return {'hdfs_dir': hdfs, 's3_dir': s3}

    def s3_upload(**context):
        hdfs = context['task_instance'].xcom_pull(task_ids='sqoop_import')['hdfs_dir']
        s3 = context['task_instance'].xcom_pull(task_ids='sqoop_import')['s3_dir']
        s3_cpdist_job = ["s3-dist-cp",
                         "--src=%s" % hdfs,
                         "--dest=%s" % s3]
        #call(s3_cpdist_job)
        return {'s3_dir': s3}
        #context['task_instance'].xcom_pull(task_ids='sqoop_import')

    def sns_notify(**context):
        s3 = context['task_instance'].xcom_pull(task_ids='distcp_s3')['s3_dir']
        client = boto3.client('sns')
        arn = 'arn:aws:sns:us-east-1:744617668409:pipeline-notification-stg'
        response = client.publish(TargetArn=arn, Message=s3)
        return response
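For completeness, these callables could be wired into the DAG with PythonOperators, something like the sketch below (assuming Airflow 1.x with provide_context=True; the DAG name, schedule, and op_kwargs values are placeholders, and the task_ids match the ones used in the xcom_pull calls above):

    from datetime import datetime
    from airflow import DAG
    from airflow.operators.python_operator import PythonOperator

    dag = DAG('sqoop_to_s3', start_date=datetime(2017, 1, 1), schedule_interval='@daily')

    extract = PythonOperator(
        task_id='sqoop_import',          # must match task_ids used in xcom_pull in s3_upload
        python_callable=sqoop_import,
        op_kwargs={'table_name': 'account', 'job_name': 'job'},
        dag=dag)

    upload = PythonOperator(
        task_id='distcp_s3',             # its return value is pulled by sns_notify
        python_callable=s3_upload,
        provide_context=True,            # injects 'task_instance' into the context
        dag=dag)

    notify = PythonOperator(
        task_id='sns_notify',
        python_callable=sns_notify,
        provide_context=True,
        dag=dag)

    upload.set_upstream(extract)
    notify.set_upstream(upload)

The value a python_callable returns is automatically pushed to XCom, which is why the downstream tasks can read it with xcom_pull(task_ids=...).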
That's not a definitive solution, so improvements are welcome. Thanks.
Check out XComs - http://airflow.incubator.apache.org/concepts.html#xcoms. These are used for communicating state between tasks.
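For example, a value returned by one task can be pulled into the templated bash_command of a downstream task, so the original BashOperator approach still works. A minimal sketch (the task ids, bucket, and directory value are only illustrative):

    from airflow.operators.python_operator import PythonOperator
    from airflow.operators.bash_operator import BashOperator

    def make_dir(**context):
        # the return value is pushed to XCom automatically
        return 'hdfs:///account/2017-01-01'

    t1 = PythonOperator(task_id='make_dir', python_callable=make_dir,
                        provide_context=True, dag=dag)

    t2 = BashOperator(
        task_id='s3_upload',
        bash_command="s3-dist-cp --src={{ ti.xcom_pull(task_ids='make_dir') }} --dest=s3://my-bucket/",
        dag=dag)

    t2.set_upstream(t1)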