
Wondering how we can run Python code through Airflow? The Airflow PythonOperator does exactly what you are looking for. It is a very simple but powerful operator, allowing you to execute a Python callable function from your DAG.

You may have seen in my course “The Complete Hands-On Course to Master Apache Airflow” that I use this operator extensively in different use cases. Indeed, mastering this operator is a must-have, and that’s what we are going to learn in this post, starting with the basics. One more thing: if you like my tutorials, you can support my work by becoming my Patron right here. No obligation, but if you want to help me, I will thank you a lot.

Getting started with the PythonOperator in Airflow

Let’s start by looking at the following very simple DAG:

from airflow.operators.dummy_operator import DummyOperator
from airflow.operators.python_operator import PythonOperator

In order to know whether the PythonOperator calls the function as expected, the message “Hello from my_func” will be printed out to the standard output each time my_func is executed.

Copy and paste the DAG into a file python_dag.py and add it to the dags/ folder of Airflow. Next, start the webserver and the scheduler and go to the Airflow UI. From there, you should have the following screen:

Now, trigger the DAG by clicking on the toggle next to the DAG’s name and let the first DAGRun finish. Once it’s done, click on the Graph icon as shown by the red arrow:

From the Graph View, we can visualise the tasks composing the DAG and how they depend on each other. Click on the task “python_task”, then, in the dialog box, click on View Log.

Finally, if we take a look at the logs produced by the “python_task”, we can see that the message “Hello from my_func” has been printed as expected, meaning the function has been executed correctly by the PythonOperator. Notice also the log message “Returned value was: None”, indicating that since we didn’t return any value from the function my_func, None is returned.

We could return a value simply by adding, below the print instruction, return my_value, where my_value can be a variable of any type we want. If you run the DAG again with this new code, you will get the following result in the logs of the task:

How to pass parameters to the PythonOperator in Airflow

Now that we know how to call a Python function, it would be very useful to know how to pass parameters to this function as well, using the PythonOperator. One way to do so is through the parameter op_args; another is through the parameter op_kwargs. Let’s see an example of both methods using the same DAG:

python_task = PythonOperator(task_id='python_task', python_callable=my_func, op_args=[...])

Here, we first modified the PythonOperator by adding the parameter op_args, set to a list of string values (it could be any type), since it only accepts a list of positional arguments. Then, in my_func, we have the parameter op_args, which is unpacked using ‘*’. We print the arguments given by the PythonOperator and, finally, we return the first argument from the op_args list. If we execute this DAG and go to the logs view of the task python_task as we did before, we get the following results:

Another way to pass parameters is through the use of op_kwargs. It works exactly like op_args; the only difference is that instead of passing a list of values, we pass a dictionary of keywords. Notice that we could specify each argument in the function’s parameters instead of using unpacking, which gives exactly the same results, as shown below:
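The op_args and op_kwargs mechanics described in this post can be sketched in plain Python, without Airflow installed, by mimicking how the PythonOperator forwards its arguments to the python_callable at execution time. This is only an illustration: the sample values 'one', 'two', 'three' and the keyword 'greeting' are assumptions, not values from the original DAG.

```python
# Plain-Python sketch (no Airflow required) of how the PythonOperator
# forwards op_args and op_kwargs to the python_callable.
# The concrete argument values below are assumptions for illustration.

def my_func(*op_args):
    """Callable used with op_args: a list of positional arguments, unpacked with *."""
    print(op_args)
    return op_args[0]  # the value the task would hand back to Airflow

def my_keyword_func(**op_kwargs):
    """Callable used with op_kwargs: a dictionary of keywords, unpacked with **."""
    print(op_kwargs)
    return op_kwargs

# Conceptually, this is what happens when each task runs:
result_args = my_func(*['one', 'two', 'three'])            # op_args=['one', 'two', 'three']
result_kwargs = my_keyword_func(**{'greeting': 'hello'})   # op_kwargs={'greeting': 'hello'}

print(result_args)  # one
```

Both callables receive exactly the same values they would get from the operator; whether you unpack with `*`/`**` or name each parameter explicitly in the function signature is purely a matter of style, as noted above.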

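As a further plain-Python illustration of the return-value behaviour discussed earlier (the “Returned value was: None” log line), here is a minimal sketch; the variable my_value and its string content are hypothetical, and any type would work:

```python
# Sketch of why the task log shows "Returned value was: None",
# and what changes once the callable returns something.
# my_value and its content are hypothetical examples.

def my_func():
    print("Hello from my_func")
    # No explicit return: Python functions return None by default,
    # hence the "Returned value was: None" log line.

result = my_func()
print(f"Returned value was: {result}")  # Returned value was: None

def my_func_with_return():
    print("Hello from my_func")
    my_value = "my_value"  # hypothetical; could be any type
    return my_value        # a returned value is what the PythonOperator pushes to XCom

print(f"Returned value was: {my_func_with_return()}")  # Returned value was: my_value
```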