status (stringclasses, 1 value) | repo_name (stringclasses, 13 values) | repo_url (stringclasses, 13 values) | issue_id (int64, 1-104k) | updated_files (stringlengths, 11-1.76k) | title (stringlengths, 4-369) | body (stringlengths, 0-254k, nullable) | issue_url (stringlengths, 38-55) | pull_url (stringlengths, 38-53) | before_fix_sha (stringlengths, 40-40) | after_fix_sha (stringlengths, 40-40) | report_datetime (unknown) | language (stringclasses, 5 values) | commit_datetime (unknown) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
closed | apache/airflow | https://github.com/apache/airflow | 21,348 | ["airflow/providers/amazon/aws/operators/glue.py"] | Status of testing Providers that were prepared on February 05, 2022 | ### Body
I have a kind request for all the contributors to the latest provider packages release.
Could you please help us to test the RC versions of the providers?
Let us know in a comment whether the issue is addressed.
These are the providers that require testing, as some substantial changes were introduced:
## Provider [amazon: 3.0.0rc1](https://pypi.org/project/apache-airflow-providers-amazon/3.0.0rc1)
- [ ] [Rename params to cloudformation_parameter in CloudFormation operators. (#20989)](https://github.com/apache/airflow/pull/20989): @potiuk
- [ ] [[SQSSensor] Add opt-in to disable auto-delete messages (#21159)](https://github.com/apache/airflow/pull/21159): @LaPetiteSouris
- [x] [Create a generic operator SqlToS3Operator and deprecate the MySqlToS3Operator. (#20807)](https://github.com/apache/airflow/pull/20807): @mariotaddeucci
- [ ] [Move some base_aws logging from info to debug level (#20858)](https://github.com/apache/airflow/pull/20858): @o-nikolas
- [ ] [Adds support for optional kwargs in the EKS Operators (#20819)](https://github.com/apache/airflow/pull/20819): @ferruzzi
- [ ] [AwsAthenaOperator: do not generate client_request_token if not provided (#20854)](https://github.com/apache/airflow/pull/20854): @XD-DENG
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
- [ ] [fix: cloudwatch logs fetch logic (#20814)](https://github.com/apache/airflow/pull/20814): @ayushchauhan0811
- [ ] [Alleviate import warning for `EmrClusterLink` in deprecated AWS module (#21195)](https://github.com/apache/airflow/pull/21195): @josh-fell
- [ ] [Rename amazon EMR hook name (#20767)](https://github.com/apache/airflow/pull/20767): @vinitpayal
- [ ] [Standardize AWS SQS classes names (#20732)](https://github.com/apache/airflow/pull/20732): @eladkal
- [ ] [Standardize AWS Batch naming (#20369)](https://github.com/apache/airflow/pull/20369): @ferruzzi
- [ ] [Standardize AWS Redshift naming (#20374)](https://github.com/apache/airflow/pull/20374): @ferruzzi
- [ ] [Standardize DynamoDB naming (#20360)](https://github.com/apache/airflow/pull/20360): @ferruzzi
- [ ] [Standardize AWS ECS naming (#20332)](https://github.com/apache/airflow/pull/20332): @ferruzzi
- [ ] [Refactor operator links to not create ad hoc TaskInstances (#21285)](https://github.com/apache/airflow/pull/21285): @josh-fell
## Provider [apache.druid: 2.3.0rc1](https://pypi.org/project/apache-airflow-providers-apache-druid/2.3.0rc1)
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
## Provider [apache.hive: 2.2.0rc1](https://pypi.org/project/apache-airflow-providers-apache-hive/2.2.0rc1)
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
## Provider [apache.spark: 2.1.0rc1](https://pypi.org/project/apache-airflow-providers-apache-spark/2.1.0rc1)
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
## Provider [apache.sqoop: 2.1.0rc1](https://pypi.org/project/apache-airflow-providers-apache-sqoop/2.1.0rc1)
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
## Provider [cncf.kubernetes: 3.0.2rc1](https://pypi.org/project/apache-airflow-providers-cncf-kubernetes/3.0.2rc1)
- [ ] [Add missed deprecations for cncf (#20031)](https://github.com/apache/airflow/pull/20031): @dimon222
## Provider [docker: 2.4.1rc1](https://pypi.org/project/apache-airflow-providers-docker/2.4.1rc1)
- [ ] [Fixes Docker xcom functionality (#21175)](https://github.com/apache/airflow/pull/21175): @ferruzzi
## Provider [exasol: 2.1.0rc1](https://pypi.org/project/apache-airflow-providers-exasol/2.1.0rc1)
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
## Provider [google: 6.4.0rc1](https://pypi.org/project/apache-airflow-providers-google/6.4.0rc1)
- [ ] [[Part 1]: Add hook for integrating with Google Calendar (#20542)](https://github.com/apache/airflow/pull/20542): @rsg17
- [ ] [Add encoding parameter to `GCSToLocalFilesystemOperator` to fix #20901 (#20919)](https://github.com/apache/airflow/pull/20919): @danneaves-ee
- [ ] [batch as templated field in DataprocCreateBatchOperator (#20905)](https://github.com/apache/airflow/pull/20905): @wsmolak
- [ ] [Make timeout Optional for wait_for_operation (#20981)](https://github.com/apache/airflow/pull/20981): @MaksYermak
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
- [ ] [Cloudsql import links fix. (#21199)](https://github.com/apache/airflow/pull/21199): @subkanthi
- [ ] [Refactor operator links to not create ad hoc TaskInstances (#21285)](https://github.com/apache/airflow/pull/21285): @josh-fell
## Provider [http: 2.0.3rc1](https://pypi.org/project/apache-airflow-providers-http/2.0.3rc1)
- [ ] [Split out confusing path combination logic to separate method (#21247)](https://github.com/apache/airflow/pull/21247): @malthe
## Provider [imap: 2.2.0rc1](https://pypi.org/project/apache-airflow-providers-imap/2.2.0rc1)
- [ ] [Add "use_ssl" option to IMAP connection (#20441)](https://github.com/apache/airflow/pull/20441): @feluelle
## Provider [jdbc: 2.1.0rc1](https://pypi.org/project/apache-airflow-providers-jdbc/2.1.0rc1)
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
## Provider [microsoft.azure: 3.6.0rc1](https://pypi.org/project/apache-airflow-providers-microsoft-azure/3.6.0rc1)
- [ ] [Refactor operator links to not create ad hoc TaskInstances (#21285)](https://github.com/apache/airflow/pull/21285): @josh-fell
## Provider [microsoft.mssql: 2.1.0rc1](https://pypi.org/project/apache-airflow-providers-microsoft-mssql/2.1.0rc1)
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
## Provider [microsoft.psrp: 1.1.0rc1](https://pypi.org/project/apache-airflow-providers-microsoft-psrp/1.1.0rc1)
- [x] [PSRP improvements (#19806)](https://github.com/apache/airflow/pull/19806): @malthe
## Provider [mysql: 2.2.0rc1](https://pypi.org/project/apache-airflow-providers-mysql/2.2.0rc1)
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
- [ ] [#20618](https://github.com/apache/airflow/pull/20618): @potiuk
## Provider [oracle: 2.2.0rc1](https://pypi.org/project/apache-airflow-providers-oracle/2.2.0rc1)
- [x] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
- [x] [Fix handling of Oracle bindvars in stored procedure call when parameters are not provided (#20720)](https://github.com/apache/airflow/pull/20720): @malthe
## Provider [postgres: 3.0.0rc1](https://pypi.org/project/apache-airflow-providers-postgres/3.0.0rc1)
- [ ] [Replaces the usage of postgres:// with postgresql:// (#21205)](https://github.com/apache/airflow/pull/21205): @potiuk
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
- [ ] [Remove `:type` lines now sphinx-autoapi supports typehints (#20951)](https://github.com/apache/airflow/pull/20951): @ashb
- [ ] [19489 - Pass client_encoding for postgres connections (#19827)](https://github.com/apache/airflow/pull/19827): @subkanthi
- [ ] [Amazon provider remove deprecation, second try (#19815)](https://github.com/apache/airflow/pull/19815): @uranusjr
## Provider [qubole: 2.1.0rc1](https://pypi.org/project/apache-airflow-providers-qubole/2.1.0rc1)
- [x] [Add Qubole how to documentation (#20058)](https://github.com/apache/airflow/pull/20058): @kazanzhy
## Provider [slack: 4.2.0rc1](https://pypi.org/project/apache-airflow-providers-slack/4.2.0rc1)
- [ ] [Return slack api call response in slack_hook (#21107)](https://github.com/apache/airflow/pull/21107): @pingzh
- [ ] [#20571](https://github.com/apache/airflow/pull/20571): @potiuk
## Provider [snowflake: 2.5.0rc1](https://pypi.org/project/apache-airflow-providers-snowflake/2.5.0rc1)
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
- [ ] [Fix #21096: Support boolean in extra__snowflake__insecure_mode (#21155)](https://github.com/apache/airflow/pull/21155): @mik-laj
## Provider [sqlite: 2.1.0rc1](https://pypi.org/project/apache-airflow-providers-sqlite/2.1.0rc1)
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
## Provider [ssh: 2.4.0rc1](https://pypi.org/project/apache-airflow-providers-ssh/2.4.0rc1)
- [ ] [Add a retry with wait interval for SSH operator (#14489)](https://github.com/apache/airflow/issues/14489): @Gaurang033
- [ ] [Add banner_timeout feature to SSH Hook/Operator (#21262)](https://github.com/apache/airflow/pull/21262): @potiuk
## Provider [tableau: 2.1.4rc1](https://pypi.org/project/apache-airflow-providers-tableau/2.1.4rc1)
- [ ] [Squelch more deprecation warnings (#21003)](https://github.com/apache/airflow/pull/21003): @uranusjr
## Provider [vertica: 2.1.0rc1](https://pypi.org/project/apache-airflow-providers-vertica/2.1.0rc1)
- [ ] [Add more SQL template fields renderers (#21237)](https://github.com/apache/airflow/pull/21237): @josh-fell
### Committer
- [X] I acknowledge that I am a maintainer/committer of the Apache Airflow project. | https://github.com/apache/airflow/issues/21348 | https://github.com/apache/airflow/pull/21353 | 8da7af2bc0f27e6d926071439900ddb27f3ae6c1 | d1150182cb1f699e9877fc543322f3160ca80780 | "2022-02-05T20:59:27Z" | python | "2022-02-06T21:25:29Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,336 | ["airflow/www/templates/airflow/trigger.html", "airflow/www/views.py", "tests/www/views/test_views_trigger_dag.py"] | Override the dag run_id from within the ui | ### Description
It would be great to have the ability to override the generated run_ids like `scheduled__2022-01-27T14:00:00+00:00` so that it is easier to find specific DAG runs in the UI. I know the REST API allows you to specify a run id, but it would be great if UI users could also specify a run_id, for example via the dag_run conf.
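For reference, the stable REST API already accepts a caller-supplied run id when triggering a run; a minimal sketch of that call (host, credentials and DAG id are placeholders, and it assumes the basic-auth API backend is enabled):
```python
# Minimal sketch: trigger a DAG run with a human-readable run_id via the stable REST API.
# Host, credentials and dag_id are placeholders; assumes the basic-auth API backend is enabled.
import requests

response = requests.post(
    "http://localhost:8080/api/v1/dags/my_dag/dagRuns",
    auth=("airflow", "airflow"),
    json={
        "dag_run_id": "backfill_for_ticket_1234",  # instead of the generated manual__<timestamp>
        "conf": {},
    },
)
response.raise_for_status()
```
The feature request is essentially to expose the same `dag_run_id` field in the trigger form of the UI.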
### Use case/motivation
_No response_
### Related issues
_No response_
### Are you willing to submit a PR?
- [x] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21336 | https://github.com/apache/airflow/pull/21851 | 340180423a687d8171413c0c305f2060f9722177 | 14a2d9d0078569988671116473b43f86aba1161b | "2022-02-04T21:10:04Z" | python | "2022-03-16T08:12:59Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,325 | ["airflow/providers/cncf/kubernetes/hooks/kubernetes.py", "airflow/providers/cncf/kubernetes/operators/spark_kubernetes.py", "tests/providers/apache/flink/operators/test_flink_kubernetes.py", "tests/providers/cncf/kubernetes/hooks/test_kubernetes_pod.py", "tests/providers/cncf/kubernetes/operators/test_spark_kubernetes.py"] | on_kill method for SparkKubernetesOperator | ### Description
In some cases Airflow sends `SIGTERM` to a task, here the `SparkKubernetesOperator`; when that happens, it also needs to send a termination signal to the corresponding Spark pods/jobs.
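What an `on_kill` hook would roughly need to do is delete the SparkApplication custom resource so that the Spark operator tears down the driver and executor pods. A minimal sketch with the raw Kubernetes Python client (the CRD coordinates assume the standard Spark on Kubernetes operator; this is not the provider's actual implementation):
```python
# Illustrative sketch only: remove the SparkApplication CR so its driver/executor pods are cleaned up.
from kubernetes import client, config

def kill_spark_application(name: str, namespace: str) -> None:
    config.load_kube_config()  # use load_incluster_config() when running inside the cluster
    api = client.CustomObjectsApi()
    api.delete_namespaced_custom_object(
        group="sparkoperator.k8s.io",   # assumes the standard Spark on K8s operator CRD
        version="v1beta2",
        namespace=namespace,
        plural="sparkapplications",
        name=name,
    )
```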
### Use case/motivation
_No response_
### Related issues
_No response_
### Are you willing to submit a PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21325 | https://github.com/apache/airflow/pull/29977 | feab21362e2fee309990a89aea39031d94c5f5bd | 9a4f6748521c9c3b66d96598036be08fd94ccf89 | "2022-02-04T13:31:39Z" | python | "2023-03-14T22:31:30Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,321 | ["airflow/providers/amazon/aws/example_dags/example_ecs_ec2.py", "airflow/providers/amazon/aws/example_dags/example_ecs_fargate.py", "airflow/providers/amazon/aws/operators/ecs.py", "docs/apache-airflow-providers-amazon/operators/ecs.rst", "tests/providers/amazon/aws/operators/test_ecs.py"] | ECS Operator does not support launch type "EXTERNAL" | ### Description
You can run ECS tasks either on EC2 instances or via AWS Fargate, and these will run in AWS. With ECS Anywhere, you are now able to run the same ECS tasks on any host that has the ECS agent - on prem, in another cloud provider, etc. The control plane resides in ECS, but the execution of the task is managed by the ECS agent.
To launch tasks on hosts that are managed by ECS Anywhere, you need to specify a launch type of EXTERNAL. This is currently not supported by the ECS Operator. When you attempt to do this, you get an "unsupported launch type" error.
The current workaround is to use boto3 to create a task and then run it using the correct parameters.
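A minimal sketch of that boto3 workaround (cluster and task definition names are placeholders):
```python
# Sketch of the boto3 workaround: run the task with the EXTERNAL launch type directly,
# bypassing the ECS operator. Cluster and task definition names are placeholders.
import boto3

ecs = boto3.client("ecs", region_name="us-east-1")
response = ecs.run_task(
    cluster="my-ecs-anywhere-cluster",
    taskDefinition="my-task-definition:1",
    launchType="EXTERNAL",  # the launch type the operator currently rejects
    count=1,
)
print(response["tasks"])
```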
### Use case/motivation
The ability to run your task code to support hybrid and multi-cloud orchestration scenarios.
### Related issues
_No response_
### Are you willing to submit a PR?
- [x] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21321 | https://github.com/apache/airflow/pull/22093 | 33ecca1b9ab99d9d15006df77757825c81c24f84 | e63f6e36d14a8cd2462e80f26fb4809ab8698380 | "2022-02-04T11:23:52Z" | python | "2022-03-11T07:25:11Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,302 | ["airflow/www/package.json", "airflow/www/static/js/graph.js", "airflow/www/static/js/tree/Tree.jsx", "airflow/www/static/js/tree/dagRuns/index.test.jsx", "airflow/www/static/js/tree/index.jsx", "airflow/www/static/js/tree/renderTaskRows.jsx", "airflow/www/static/js/tree/renderTaskRows.test.jsx", "airflow/www/static/js/tree/useTreeData.js", "airflow/www/static/js/tree/useTreeData.test.jsx", "airflow/www/yarn.lock"] | Pause auto-refresh when the document becomes hidden | ### Description
When running Airflow it can be common to leave some tabs of Airflow open but not active. I believe (but not 100% sure, if I am wrong I can close this issue) Airflow's auto-refresh keeps refreshing when the document becomes hidden (for example, you switched to another browser tab).
This is not desirable in the cases when you are running the Airflow services on your same machine and you have a long-running DAG (taking hours to run). This could cause your CPU utilization to ramp up in this scenario (which can be quite common for users, myself included):
1. You are running the Airflow services on your same machine
2. Your machine is not that powerful
3. You have a long-running DAG (taking hours to run)
4. You leave one or more auto-refreshing pages of that DAG (such as tree or graph) open for a long time in hidden (or non-focused) tabs of your browser
- What can make this even worse is if you have multiple tabs like this open, you are multiplying the extra processing power to refresh the page at a short interval
5. You have not increased the default `auto_refresh_interval` of 3
### Use case/motivation
I am proposing the following improvements to the auto-refresh method to improve this situation:
1. When you change tabs in your browser, there is a feature of Javascript in modern browsers called "Page Visibility API". It allows for the use of listeners on a `visibilitychange` event to know when a document becomes visible or hidden. This can be used to pause auto-refresh when the document becomes hidden.
- Discussion on Stack Overflow: https://stackoverflow.com/questions/1060008/is-there-a-way-to-detect-if-a-browser-window-is-not-currently-active
- MDN: https://developer.mozilla.org/en-US/docs/Web/API/Page_Visibility_API
- W3C: https://www.w3.org/TR/page-visibility/
2. We should provide a message in the UI to alert the user that the auto-refreshing is paused until the page regains focus.
3. Lastly, the option to only auto-refresh if the document is visible should be a configurable setting.
Additionally, the older `onblur` and `onfocus` listeners on the entire document could be used too. That way if a user switches to a different window while the page is still visible, the auto-refresh can pause (although this might not be desirable if you want to have Airflow open side-by-side with something else, so maybe this will be overboard)
### Related issues
_No response_
### Are you willing to submit a PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21302 | https://github.com/apache/airflow/pull/21904 | dfd9805a23b2d366f5c332f4cb4131462c5ba82e | 635fe533700f284da9aa04a38a5dae9ad6485454 | "2022-02-03T19:57:20Z" | python | "2022-03-08T18:31:43Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,201 | ["airflow/www/static/js/gantt.js", "airflow/www/static/js/graph.js", "airflow/www/static/js/task_instances.js", "airflow/www/views.py"] | Add Trigger Rule Display to Graph View | ### Description
This feature would introduce some visual addition(s) (e.g. tooltip) to the Graph View to display the trigger rule between tasks.
### Use case/motivation
This would add more detail to Graph View, providing more information visually about the relationships between upstream and downstream tasks.
### Related issues
https://github.com/apache/airflow/issues/19939
### Are you willing to submit a PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21201 | https://github.com/apache/airflow/pull/26043 | bdc3d4da3e0fb11661cede149f2768acb2080d25 | f94176bc7b28b496c34974b6e2a21781a9afa221 | "2022-01-28T23:17:52Z" | python | "2022-08-31T19:51:43Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,188 | ["airflow/www/static/js/connection_form.js"] | "Test Connection" functionality in the UI doesn't consider custom form fields | ### Apache Airflow version
main (development)
### What happened
When creating a connection to Snowflake in the Airflow UI, I was hoping to test the connection prior to saving the connection. Unfortunately when I click on the Test button I receive the following error despite all fields in the connection form (especially Account) being provided.

However, when I specify the connection parameters directly in the Extra field, the connection test is successful.

### What you expected to happen
I would have expected that I could use the custom fields in the connection form to test the connection. While using the Extra field is a workaround for the Snowflake connection type, not all custom connection forms expose the Extra field (e.g. Azure Data Factory, Salesforce, etc.) which makes this workaround impossible when testing in the UI.
### How to reproduce
- Attempt to create a Snowflake connection (or other connection types that have the `test_connection()` method implemented which use custom fields for authentication).
- Click the Test button in the UI.
### Operating System
Debian GNU/Linux 10 (buster
### Versions of Apache Airflow Providers
N/A - using `main` branch.
### Deployment
Other
### Deployment details
Testing with `main` using Breeze.
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21188 | https://github.com/apache/airflow/pull/21330 | fc44836504129664edb81c510e6deb41a7e1126d | a9b8ac5e0dde1f1793687a035245fde73bd146d4 | "2022-01-28T15:19:40Z" | python | "2022-02-15T19:00:51Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,183 | ["airflow/sensors/external_task.py", "tests/sensors/test_external_task_sensor.py"] | Webserver "External DAG" button on ExternalTaskSensor not working when dag_id is templated | ### Apache Airflow version
2.0.2
### What happened
When an ExternalTaskSensor receives a templated dag_id, the web interface's "External DAG" button does not resolve the template, so the destination URL does not work (although the sensor correctly waits for the dag_id that the template refers to).



### What you expected to happen
The button's destination URL should point to the templated dag_id.
### How to reproduce
1. Create a DAG with an ExternalTaskSensor whose dag_id is templated
```python
from datetime import datetime
from typing import Any

from airflow.models.dag import get_last_dagrun
from airflow.sensors.external_task import ExternalTaskSensor
from airflow.utils.session import provide_session

@provide_session
def get_last_run_id(execution_date: datetime, session: Any, **context: Any) -> datetime:
    dag_id = context['task_instance'].xcom_pull('get_idemp', key='id_emp')
    # Busy-wait until the external DAG has at least one run, then use its execution date.
    while (last_run := get_last_dagrun(dag_id, session, include_externally_triggered=True)) is None:
        continue
    return last_run.execution_date

sensor = ExternalTaskSensor(
    task_id="wait_for_dag",
    external_dag_id="{{ ti.xcom_pull('get_idemp', key='id_emp') }}",
    external_task_id="update_load_registry",  # last operator of the external DAG
    execution_date_fn=get_last_run_id,
)
```
2. Trigger the created DAG
3. Click on the ExternalTaskSensor operator
4. Click on the "External DAG" button
### Operating System
Debian GNU/Linux 10 (buster) (on KubernetesExecutor)
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other 3rd-party Helm chart
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21183 | https://github.com/apache/airflow/pull/21192 | 6b88d432d959df73433528fe3d62194239f13edd | 8da7af2bc0f27e6d926071439900ddb27f3ae6c1 | "2022-01-28T12:18:30Z" | python | "2022-02-06T21:14:21Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,105 | ["BREEZE.rst", "CONTRIBUTING.rst", "CONTRIBUTORS_QUICK_START.rst", "dev/provider_packages/README.md", "docs/apache-airflow/installation/installing-from-pypi.rst", "scripts/tools/initialize_virtualenv.py"] | Breeze: Setting up local virtualenv | There shoudl be a comand that allows to set-up local virtualenv easly.
This involves:
* checking if airflow is installed in "${HOME}/airflow" and warning and suggesting to move elsewhere if so (this is very bad because ${HOME}/airflow is by default where airflow stores all files (logs/config etc.)
* cleaning the "${HOME}/airflow" and regenerating all necessary folders and files
* checking if the virtualenv is activated - if not, writing a helpful message
* checking if additional dependencies are installed, and based on the OS, suggest what should be installed (note we do not have Windows support here).
* installing airflow with the right extra ([devel-all] I think)
* initializing the sqlite databases of airflow - both "normal" and "unit test" database | https://github.com/apache/airflow/issues/21105 | https://github.com/apache/airflow/pull/22971 | 03f7d857e940b9c719975e72ded4a89f183b0100 | 03bef084b3f1611e1becdd6ad0ff4c0d2dd909ac | "2022-01-25T16:14:33Z" | python | "2022-04-21T13:59:03Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,104 | ["BREEZE.rst", "dev/breeze/src/airflow_breeze/commands/developer_commands.py", "dev/breeze/src/airflow_breeze/shell/enter_shell.py", "images/breeze/output-commands.svg", "images/breeze/output-exec.svg"] | Breeze: Exec'ing into running Breeze | `./Breeze2 exec` should exec into the currently running Breeze (or fail with helpful message if Breeze is not running). | https://github.com/apache/airflow/issues/21104 | https://github.com/apache/airflow/pull/23052 | b6db0e90aeb30133086716a433cab9dca7408a54 | 94c3203e86252ed120d624a70aed571b57083ea4 | "2022-01-25T16:09:29Z" | python | "2022-04-28T19:31:49Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,102 | ["dev/breeze/src/airflow_breeze/breeze.py", "dev/breeze/src/airflow_breeze/ci/build_image.py", "dev/breeze/src/airflow_breeze/ci/build_params.py", "dev/breeze/src/airflow_breeze/prod/prod_params.py"] | Breeze: Add 'prepare-image-cache' in Breeze | We have a separate command that prepares image caches. It is very similar to "building image" but:
* it should prepare both prod and CI images
* the `docker build` command should be slightly modified (--cache-to and some other commands)
* validation on whether the `buildx plugin` needs to be performed (and command should fail if not with helpful message)
Those "differences" between standard build-image can be found with `PREPARE_BUILDX_CACHE` variable == "true" usage in old breeze | https://github.com/apache/airflow/issues/21102 | https://github.com/apache/airflow/pull/22344 | 4e4c0574cdd3689d22e2e7d03521cb82179e0909 | dc75f5d8768c8a42df29c86beb519de282539e1f | "2022-01-25T16:03:43Z" | python | "2022-04-01T08:47:06Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,100 | ["dev/breeze/src/airflow_breeze/breeze.py", "dev/breeze/src/airflow_breeze/ci/build_params.py", "dev/breeze/src/airflow_breeze/global_constants.py", "dev/breeze/src/airflow_breeze/prod/__init__.py", "dev/breeze/src/airflow_breeze/prod/build_prod_image.py", "dev/breeze/src/airflow_breeze/prod/prod_params.py", "dev/breeze/src/airflow_breeze/utils/docker_command_utils.py", "dev/breeze/src/airflow_breeze/utils/path_utils.py", "dev/breeze/src/airflow_breeze/utils/run_utils.py", "dev/breeze/tests/test_prod_image.py"] | Breeze: Build PROD image with Breeze | Similarly to building CI image, we should build PROD image | https://github.com/apache/airflow/issues/21100 | https://github.com/apache/airflow/pull/21956 | 7418720ce173ca5d0c5f5197c168e43258af8cc3 | 4eebabb76d1d50936c4b63669a93358f4d100ce3 | "2022-01-25T15:53:50Z" | python | "2022-03-29T15:28:42Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,083 | ["airflow/models/dagrun.py", "tests/jobs/test_scheduler_job.py"] | A high value of min_file_process_interval & max_active_runs=1 causes stuck dags | ### Apache Airflow version
2.2.2
### What happened
When the value of `AIRFLOW__SCHEDULER__MIN_FILE_PROCESS_INTERVAL` is set to 86400, all dags whose `MAX_ACTIVE_RUNS` is set to 1 stop executing and remain stuck forever. If `MAX_ACTIVE_RUNS` is set to 2 or above, or `AIRFLOW__SCHEDULER__MIN_FILE_PROCESS_INTERVAL` is set to a lower value (between 30 and 300), dags work just fine.
### What you expected to happen
These two settings should be independent of each other; one should not directly impact the other.
### How to reproduce
set AIRFLOW__SCHEDULER__MIN_FILE_PROCESS_INTERVAL to 86400
set MAX_ACTIVE_RUNS to 1 on any dag & observe its execution dates.
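A minimal reproduction sketch under those settings (DAG and task names are placeholders):
```python
# Reproduction sketch: a DAG with max_active_runs=1; run the scheduler with
# AIRFLOW__SCHEDULER__MIN_FILE_PROCESS_INTERVAL=86400 set in its environment.
import pendulum

from airflow import DAG
from airflow.operators.bash import BashOperator

with DAG(
    dag_id="stuck_dag_repro",
    start_date=pendulum.datetime(2022, 1, 1, tz="UTC"),
    schedule_interval="@hourly",
    max_active_runs=1,
    catchup=True,
) as dag:
    BashOperator(task_id="sleep", bash_command="sleep 30")
```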
### Operating System
Debian GNU/Linux 11 (bullseye)
### Versions of Apache Airflow Providers
apache-airflow-providers-amazon==1!2.3.0
apache-airflow-providers-cncf-kubernetes==1!2.1.0
apache-airflow-providers-elasticsearch==1!2.1.0
apache-airflow-providers-ftp==1!2.0.1
apache-airflow-providers-google==1!6.1.0
apache-airflow-providers-http==1!2.0.1
apache-airflow-providers-imap==1!2.0.1
apache-airflow-providers-microsoft-azure==1!3.3.0
apache-airflow-providers-mysql==1!2.1.1
apache-airflow-providers-postgres==1!2.3.0
apache-airflow-providers-redis==1!2.0.1
apache-airflow-providers-slack==1!4.1.0
apache-airflow-providers-sqlite==1!2.0.1
apache-airflow-providers-ssh==1!2.3.0
### Deployment
Astronomer
### Deployment details
Deployed on AKS via Astronomer Helm chart.
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21083 | https://github.com/apache/airflow/pull/21413 | 5fbf2471ab4746f5bc691ff47a7895698440d448 | feea143af9b1db3b1f8cd8d29677f0b2b2ab757a | "2022-01-25T05:49:42Z" | python | "2022-02-24T07:12:12Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,057 | ["airflow/providers/amazon/aws/transfers/dynamodb_to_s3.py"] | Templated fields for DynamoDBToS3Operator similar to MySQLToS3Operator | ### Description
I am using Airflow to move data periodically to our datalake and noticed that the MySQLToS3Operator has templated fields and the DynamoDBToS3Operator doesn't. I found a semi-awkward workaround but thought templated fields would be nice.
I suppose an implementation could be as simple as adding

    template_fields = (
        's3_bucket',
        's3_key',
    )

to the Operator, similar to the MySQLToS3Operator.
Potentially one could also add a log call in execute()
as well as using with in the NamedTemporaryFile
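A hedged sketch of the same idea applied through a thin subclass while waiting for upstream support (the constructor argument names `s3_bucket_name` and `s3_key_prefix` are assumptions; match them to your provider version):
```python
# Hedged sketch: declare the S3 destination fields as templated via a subclass.
# Constructor argument names are assumptions about the provider version in use.
import pendulum

from airflow import DAG
from airflow.providers.amazon.aws.transfers.dynamodb_to_s3 import DynamoDBToS3Operator

class TemplatedDynamoDBToS3Operator(DynamoDBToS3Operator):
    template_fields = ("s3_bucket_name", "s3_key_prefix")

with DAG(
    dag_id="dynamodb_dump",
    start_date=pendulum.datetime(2022, 1, 1, tz="UTC"),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    dump_table = TemplatedDynamoDBToS3Operator(
        task_id="dump_table",
        dynamodb_table_name="my_table",
        s3_bucket_name="my-datalake-bucket",
        s3_key_prefix="dynamodb/my_table/{{ ds }}/",  # rendered by Jinja once the field is templated
        file_size=1000,
    )
```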
### Use case/motivation
At the moment MySQLToS3Operator and DynamoDBToS3Operator behave differently in terms of templating, and some of the code in MySQLToS3Operator seems more refined.
### Related issues
_No response_
### Are you willing to submit a PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21057 | https://github.com/apache/airflow/pull/22080 | c8d49f63ca60fa0fb447768546c2503b746a66dd | a150ee0bc124f21b99fa94adbb16e6ccfe654ae4 | "2022-01-24T08:36:19Z" | python | "2022-03-08T13:09:07Z" |
closed | apache/airflow | https://github.com/apache/airflow | 21,036 | ["airflow/www/views.py", "tests/www/views/test_views.py"] | Recent Tasks mixes status for current and old dag runs | ### Apache Airflow version
2.1.4
### What happened
The Recent Tasks column on the /home dashboard is showing the status of tasks from different dag runs.
See images attached:


### What you expected to happen
As stated in the tooltip, I expect the Recent Tasks column to show the status of tasks for the last run if the DAG isn't currently running, or for the current DAG run (only) if there's one in progress.
### How to reproduce
- trigger a manual run of a dag, make any task fail and get some tasks marked as `failed` and `upstream_failed`.
- then trigger the dag again, and see how the Recent Tasks column in /home dashboard is showing tasks failed and upstream_failed from previous run along with running/completed from the current dag run in progress.
### Operating System
using apache/airflow:2.1.4-python3.8 image
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other Docker-based deployment
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/21036 | https://github.com/apache/airflow/pull/21352 | 28378d867afaac497529bd2e1d2c878edf66f460 | 28d7bde2750c38300e5cf70ba32be153b1a11f2c | "2022-01-22T21:34:43Z" | python | "2022-02-14T15:55:00Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,999 | ["docs/helm-chart/index.rst"] | ArgoCD deployment: build in redis is not restarted on password change | ### Official Helm Chart version
1.4.0 (latest released)
### Apache Airflow version
2.2.3 (latest released)
### Kubernetes Version
1.21.x,1.18.x
### Helm Chart configuration
KubernetesCeleryExecutor used
### Docker Image customisations
_No response_
### What happened
On each ArgoCD Airflow app update, ```{{ .Release.Name }}-redis-password``` and ```{{ .Release.Name }}-broker-url``` are regenerated (since ArgoCD does not honor ```"helm.sh/hook": "pre-install"```). The Airflow pods are restarted (as expected) and pick up the new Redis connection, but connections start failing with WRONG_PASSWORD, since Redis itself is not restarted and keeps using the old password. Redis in chart 1.4 has no health checks.
### What you expected to happen
Generally, I have two options:
1. (Preferred) Add a health check to Redis with a login sequence. The password should be updated "on the fly" (read from the mounted secret and try to connect).
2. (Workaround) I implemented a workaround with the parent chart: the ```{{ .Release.Name }}-redis-password``` and ```{{ .Release.Name }}-broker-url``` secrets are generated from a template where ```immutable: true``` is added to the secret definition.
### How to reproduce
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20999 | https://github.com/apache/airflow/pull/29078 | 30ad26e705f50442f05dd579990372196323fc86 | 6c479437b1aedf74d029463bda56b42950278287 | "2022-01-20T20:57:48Z" | python | "2023-01-27T20:58:56Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,993 | ["airflow/models/dag.py", "tests/models/test_dag.py"] | DagFileProcessor 'NoneType' is not iterable | ### Apache Airflow version
2.2.2
### What happened
I'm seeing the same log repeating in the Scheduler.
I'm working in a restricted network, so I cannot paste the entire log:
```
in bulk_write_to_db
if orm_tag.name not in set(dag.tags)
TypeError: 'NoneType' object is not iterable
```
I saw that a single DAG didn't have any tags, and I tried to add a tag, but the log is still appearing.
### What you expected to happen
_No response_
### How to reproduce
_No response_
### Operating System
Debian 10 (Scheduler image)
### Versions of Apache Airflow Providers
_No response_
### Deployment
Official Apache Airflow Helm Chart
### Deployment details
I'm deploying on OpenShift 4.8 using the official Helm Chart v1.3.0
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20993 | https://github.com/apache/airflow/pull/21757 | 768d851ca995bbe46cfdaeed7c46a51201b723e2 | d7265791187fb2117dfd090cdb7cce3f8c20866c | "2022-01-20T17:21:52Z" | python | "2022-02-28T00:35:33Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,966 | ["airflow/hooks/subprocess.py", "airflow/providers/cncf/kubernetes/utils/pod_manager.py", "tests/hooks/test_subprocess.py"] | Exception when parsing log | ### Apache Airflow version
2.1.4
### What happened
```
[2022-01-19 13:42:46,107] {taskinstance.py:1463} ERROR - Task failed with exception
Traceback (most recent call last):
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1165, in _run_raw_task
    self._prepare_and_execute_task_with_callbacks(context, task)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1283, in _prepare_and_execute_task_with_callbacks
    result = self._execute_task(context, task_copy)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1313, in _execute_task
    result = task_copy.execute(context=context)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py", line 367, in execute
    final_state, remote_pod, result = self.create_new_pod_for_operator(labels, launcher)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/providers/cncf/kubernetes/operators/kubernetes_pod.py", line 521, in create_new_pod_for_operator
    final_state, remote_pod, result = launcher.monitor_pod(pod=self.pod, get_logs=self.get_logs)
  File "/home/airflow/.local/lib/python3.9/site-packages/airflow/providers/cncf/kubernetes/utils/pod_launcher.py", line 148, in monitor_pod
    timestamp, message = self.parse_log_line(line.decode('utf-8'))
UnicodeDecodeError: 'utf-8' codec can't decode bytes in position 16413-16414: invalid continuation byte
```
### What you expected to happen
_No response_
### How to reproduce
_No response_
### Operating System
Debian GNU/Linux 10 (buster)
### Versions of Apache Airflow Providers
apache-airflow-providers-amazon==2.2.0
apache-airflow-providers-celery==2.0.0
apache-airflow-providers-cncf-kubernetes==2.0.2
apache-airflow-providers-docker==2.1.1
apache-airflow-providers-elasticsearch==2.0.3
apache-airflow-providers-ftp==2.0.1
apache-airflow-providers-google==5.1.0
apache-airflow-providers-grpc==2.0.1
apache-airflow-providers-hashicorp==2.1.0
apache-airflow-providers-http==2.0.1
apache-airflow-providers-imap==2.0.1
apache-airflow-providers-microsoft-azure==3.1.1
apache-airflow-providers-mysql==2.1.1
apache-airflow-providers-postgres==2.2.0
apache-airflow-providers-redis==2.0.1
apache-airflow-providers-sendgrid==2.0.1
apache-airflow-providers-sftp==2.1.1
apache-airflow-providers-slack==4.0.1
apache-airflow-providers-sqlite==2.0.1
apache-airflow-providers-ssh==2.1.1
### Deployment
Official Apache Airflow Helm Chart
### Deployment details
_No response_
### Anything else
Always on a specific docker container.
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20966 | https://github.com/apache/airflow/pull/23301 | c5b72bf30c8b80b6c022055834fc7272a1a44526 | 863b2576423e1a7933750b297a9b4518ae598db9 | "2022-01-19T21:10:09Z" | python | "2022-05-10T20:43:25Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,901 | ["airflow/providers/google/cloud/transfers/gcs_to_local.py", "tests/providers/google/cloud/transfers/test_gcs_to_local.py"] | Bytes cast to String in `airflow.providers.google.cloud.transfers.gcs_to_local.GCSToLocalFilesystemOperator` ~142 | ### Apache Airflow Provider(s)
google
### Versions of Apache Airflow Providers
```
$ pip freeze | grep apache-airflow-providers
apache-airflow-providers-ftp==2.0.1
apache-airflow-providers-google==6.3.0
apache-airflow-providers-http==2.0.2
apache-airflow-providers-imap==2.1.0
apache-airflow-providers-pagerduty==2.1.0
apache-airflow-providers-sftp==2.4.0
apache-airflow-providers-sqlite==2.0.1
apache-airflow-providers-ssh==2.3.0
```
### Apache Airflow version
2.2.3 (latest released)
### Operating System
Ubuntu 20.04.3 LTS
### Deployment
Composer
### Deployment details
_No response_
### What happened
Using `airflow.providers.google.cloud.transfers.gcs_to_local.GCSToLocalFilesystemOperator` to load the contents of a file into `xcom` unexpectedly casts the file bytes to string.
### What you expected to happen
`GCSToLocalFilesystemOperator` should not cast to string
### How to reproduce
Store a file on gcs;
```
Hello World!
```
Read file to xcom
```
my_task = GCSToLocalFilesystemOperator(
    task_id='my_task',
    bucket=bucket,
    object_name=object_path,
    store_to_xcom_key='my_xcom_key',
)
```
Access via jinja;
```
{{ ti.xcom_pull(task_ids="my_task", key="my_xcom_key") }}
```
XCom result is;
```
b'Hello World!'
```
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20901 | https://github.com/apache/airflow/pull/20919 | b171e03924fba92924162563f606d25f0d75351e | b8526abc2c220b1e07eed83694dfee972c2e2609 | "2022-01-17T10:04:57Z" | python | "2022-01-19T11:39:49Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,877 | ["airflow/config_templates/config.yml", "airflow/config_templates/default_airflow.cfg", "airflow/www/views.py", "docs/apache-airflow/howto/customize-ui.rst", "tests/www/views/test_views_base.py"] | Allow for Markup in UI page title | ### Description
A custom page title can be set on the UI with the `instance_name` variable in `airflow.cfg`. It would be nice to have the option to include Markup text in that variable for further customization of the title, similar to how dashboard alerts introduced in #18284 allow for `html=True`.
### Use case/motivation
It would be useful to be able to use formatting like underline, bold, color, etc, in the UI title. For example, color-coded environments to minimize chances of a developer triggering a DAG in the wrong environment:

### Related issues
_No response_
### Are you willing to submit a PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20877 | https://github.com/apache/airflow/pull/20888 | 75755d7f65fb06c6e2e74f805b877774bfa7fcda | a65555e604481388da40cea561ca78f5cabb5f50 | "2022-01-14T14:10:45Z" | python | "2022-01-19T12:44:26Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,876 | ["airflow/migrations/versions/e655c0453f75_add_taskmap_and_map_id_on_taskinstance.py", "airflow/models/taskinstance.py", "airflow/models/taskreschedule.py"] | Airflow database upgrade fails with "psycopg2.errors.NotNullViolation: column "map_index" of relation "task_instance" contains null value"s | ### Apache Airflow version
main (development)
### What happened
I currently have Airflow 2.2.3 and due to this [issue](https://github.com/apache/airflow/issues/19699) I have tried to upgrade Airflow to this [commit](https://github.com/apache/airflow/commit/14ee831c7ad767e31a3aeccf3edbc519b3b8c923).
When I run `airflow db upgrade` I get the following error:
```
INFO [alembic.runtime.migration] Context impl PostgresqlImpl.
INFO [alembic.runtime.migration] Will assume transactional DDL.
INFO [alembic.runtime.migration] Running upgrade 587bdf053233 -> e655c0453f75, Add TaskMap and map_index on TaskInstance.
Traceback (most recent call last):
File "/usr/local/lib/python3.7/dist-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
cursor, statement, parameters, context
File "/usr/local/lib/python3.7/dist-packages/sqlalchemy/engine/default.py", line 608, in do_execute
cursor.execute(statement, parameters)
psycopg2.errors.NotNullViolation: column "map_index" of relation "task_instance" contains null values
```
The map_index column was introduced with this [PR](https://github.com/apache/airflow/pull/20286).
Could you please advise?
### What you expected to happen
_No response_
### How to reproduce
_No response_
### Operating System
Ubuntu 18.04.6 LTS
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other
### Deployment details
Kubernetes
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20876 | https://github.com/apache/airflow/pull/20902 | 7e29506037fa543f5d9b438320db064cdb820c7b | 66276e68ba37abb2991cb0e03ca93c327fc63a09 | "2022-01-14T14:10:05Z" | python | "2022-01-18T04:55:31Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,873 | ["airflow/www/static/css/bootstrap-theme.css"] | Distinguish links in DAG documentation from code | ### Description
Currently code blocks (i.e. using backticks in markdown) look the same as links when using `dag.doc_md`.
This makes it very difficult to distinguish what is clickable in links.
For example in the image below,

`document-chunker` is a code block, whereas `offline-processing-storage-chunking-manual-trigger-v5` is a link - but they appear identical.
If this was rendered on github it'd look like:
> The restored files must subsequently be restored using a manual trigger of `document-chunker`: [offline-processing-storage-chunking-manual-trigger-v5](https://invalid-link.com), set `s3_input_base_path` as copied and provide the same start and end dates.
notice that the link is clearly different to the code block.
We're using Airflow version 2.1.4.
### Use case/motivation
We're trying to link between manual steps which are possible with Airflow runs for recovery, reducing the need to write long pieces of documentation.
The links are currently difficult to distinguish which causes issues when following instructions.
### Related issues
_No response_
### Are you willing to submit a PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20873 | https://github.com/apache/airflow/pull/20938 | 892204105154fdc520758e66512fb6021d404e57 | cdb120de2403f5a21aa6d84d10f68c1b7f086aba | "2022-01-14T12:32:24Z" | python | "2022-01-19T16:18:51Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,839 | ["airflow/www/views.py", "tests/www/views/test_views_connection.py"] | Cannot edit custom fields on provider connections | ### Apache Airflow version
2.2.3 (latest released)
### What happened
Connections from providers are not saving edited values in any custom connection forms. You can work around the issue by changing connection type to something like HTTP, and modifying the extra field's JSON.
### What you expected to happen
_No response_
### How to reproduce
Using the official docker compose deployment, add a new connection of type 'Azure Data Explorer' and fill out one of the custom connection fields (e.g., "Tenant ID"). Save the record. Edit the record and enter a new value for the same field or any other field that is defined in the associated Hook's `get_connection_form_widgets` function. Save the record. Edit again. The changes were not saved.
### Operating System
Windows 10, using docker
### Versions of Apache Airflow Providers
_No response_
### Deployment
Docker-Compose
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20839 | https://github.com/apache/airflow/pull/20883 | bad070f7f484a9b4065a0d86195a1e8002d9bfef | 44df1420582b358594c8d7344865811cff02956c | "2022-01-13T01:05:58Z" | python | "2022-01-24T00:33:06Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,823 | ["STATIC_CODE_CHECKS.rst"] | Static check docs mistakes in example | ### Describe the issue with documentation
<img width="1048" alt="Screenshot 2022-01-12 at 4 22 44 PM" src="https://user-images.githubusercontent.com/10162465/149127114-5810f86e-83eb-40f6-b438-5b18b7026e86.png">
Run the flake8 check for the tests.core package with verbose output:
./breeze static-check mypy -- --files tests/hooks/test_druid_hook.py
The doc says it is the flake8 check for the tests.core package, but the command actually runs the mypy check on specific files.
### How to solve the problem
It can be solved by adding the right instruction.
./breeze static-check flake8 -- --files tests/core/* --verbose
I didn't check if the above command works, but we have to use a similar command to the one above.
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20823 | https://github.com/apache/airflow/pull/20844 | 8dc68d47048d559cf4b76874d8d5e7a5af6359b6 | c49d6ec8b67e48d0c0fba1fe30c00f590f88ae65 | "2022-01-12T11:14:33Z" | python | "2022-01-13T07:30:46Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,804 | ["airflow/dag_processing/manager.py", "airflow/sensors/smart_sensor.py", "tests/dag_processing/test_manager.py", "tests/sensors/test_smart_sensor_operator.py"] | Some timing metrics are in seconds but reported as milliseconds | ### Apache Airflow version
2.2.2
### What happened
When Airflow reports timing stats it uses either a `timedelta` or a direct value. When using `timedelta` it is converted automatically to the correct units of measurement but when using a direct value it is accepted to already be in the correct units.
Unfortunately the Stats class, either being statsd.StatsClient or a stub, expects *milliseconds* while the Airflow code passes the value in *seconds*.
The result is two of the timing metrics are wrong by a magnitude of 1000.
This affects `dag_processing.last_duration.<dag_file>` and `smart_sensor_operator.loop_duration`.
The rest either pass `timedelta` or use a `Stats.timer` which calculates timing on its own and is not affected.
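A short conceptual illustration of the mismatch (not the exact Airflow call sites):
```python
# Conceptual illustration, not the exact Airflow call sites.
from datetime import timedelta

from airflow.stats import Stats

duration = 2.5  # seconds, as measured by the caller

# Passing a timedelta: the statsd client converts it to milliseconds itself -> reported as 2500 ms.
Stats.timing("dag_processing.last_duration.my_dag_file", timedelta(seconds=duration))

# Passing a bare number: statsd assumes it is already milliseconds -> 2.5 s shows up as 2.5 ms.
Stats.timing("dag_processing.last_duration.my_dag_file", duration)
```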
### What you expected to happen
All timing metrics to be in the correct unit of measurement.
### How to reproduce
Run a statsd-exporter and a prometheus to collect the metrics and compare to the logs.
For the dag processing metric, the scheduler logs the amounts and can be directly compared to the gathered metric.
### Operating System
Linux
### Versions of Apache Airflow Providers
_No response_
### Deployment
Docker-Compose
### Deployment details
Using these two with the configs below to process the metrics. The metrics can be viewed in the prometheus UI on `localhost:9090`.
```
prometheus:
  image: prom/prometheus:v2.32.1
  command:
    - --config.file=/etc/prometheus/config.yml
    - --web.console.libraries=/usr/share/prometheus/console_libraries
    - --web.console.templates=/usr/share/prometheus/consoles
  ports:
    - 9090:9090
  volumes:
    - ./prometheus:/etc/prometheus

statsd-exporter:
  image: prom/statsd-exporter:v0.22.4
  command:
    - --statsd.mapping-config=/tmp/statsd_mapping.yml
  ports:
    - 9102:9102
    - 9125:9125
    - 9125:9125/udp
  volumes:
    - ./prometheus/statsd_mapping.yml:/tmp/statsd_mapping.yml
```
The prometheus config is:
```
global:
  scrape_interval: 15s

scrape_configs:
  - job_name: airflow_statsd
    scrape_interval: 1m
    scrape_timeout: 30s
    static_configs:
      - targets:
          - statsd-exporter:9102
```
The metrics mapping for statsd-exporter is:
```
mappings:
  - match: "airflow.dag_processing.last_duration.*"
    name: "airflow_dag_processing_last_duration"
    labels:
      dag_file: "$1"
  - match: "airflow.collect_db_tags"
    name: "airflow_collect_db_tags"
    labels: {}
  - match: "airflow.scheduler.critical_section_duration"
    name: "airflow_scheduler_critical_section_duration"
    labels: {}
  - match: "airflow.dagrun.schedule_delay.*"
    name: "airflow_dagrun_schedule_delay"
    labels:
      dag_id: "$1"
  - match: "airflow.dag_processing.total_parse_time"
    name: "airflow_dag_processing_total_parse_time"
    labels: {}
  - match: "airflow.dag_processing.last_run.seconds_ago.*"
    name: "airflow_dag_processing_last_run_seconds_ago"
    labels:
      dag_file: "$1"
```
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20804 | https://github.com/apache/airflow/pull/21106 | 6e96f04eb515149f185448b8dfb84813c5879fc0 | 1507ca48d7c211799129ce7956c11f4c45fee5bc | "2022-01-11T09:44:40Z" | python | "2022-06-01T04:40:36Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,746 | ["airflow/providers/amazon/aws/hooks/emr.py"] | Airflow Elastic MapReduce connection | ### Description
Airflow has connections:
```
Amazon Web Services
Amazon Redshift
Elastic MapReduce
```
It wasn't easy to find the `Elastic MapReduce` because it wasn't listed as Amazon.
I think it would be better to rename it to `Amazon Elastic MapReduce` so it is easier to find in the Connections drop-down list.
### Use case/motivation
_No response_
### Related issues
_No response_
### Are you willing to submit a PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20746 | https://github.com/apache/airflow/pull/20767 | da9210e89c618611b1e450617277b738ce92ffd7 | 88e3f2ae5e5101928858099f9d4e7fb6542c4110 | "2022-01-07T12:37:23Z" | python | "2022-01-08T15:26:08Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,740 | [".pre-commit-config.yaml", "dev/breeze/src/airflow_breeze/breeze.py", "dev/breeze/src/airflow_breeze/global_constants.py", "dev/breeze/src/airflow_breeze/pre_commit_ids.py", "dev/breeze/src/airflow_breeze/pre_commit_ids_TEMPLATE.py.jinja2", "dev/breeze/src/airflow_breeze/utils/run_utils.py", "dev/breeze/tests/test_cache.py", "scripts/ci/pre_commit/pre_commit_check_pre_commit_hook_names.py"] | Breeze: Running static checks with Breeze | We should rewrite the action to run static checks with Breeze. Currently implemented by 'static-checks' opetoin.
It should basically run the appropriate `pre-commit run` statement. The difference vs. running just pre-commit is that it should implement auto-complete of the available checks (pre-commit does not have it) and make sure that the CI image is built for the checks that require it. The list of available static checks should be retrieved by parsing the .pre-commit.yml file rather than (as it is currently done) maintaining the list in the ./breeze-complete script.
More info on static checks: https://github.com/apache/airflow/blob/main/STATIC_CODE_CHECKS.rst
| https://github.com/apache/airflow/issues/20740 | https://github.com/apache/airflow/pull/20848 | 82adce535eb0c427c230035d648bf3c829824b21 | 684fe46158aa3d6cb2de245d29e20e487d8f2158 | "2022-01-07T08:54:44Z" | python | "2022-01-27T17:34:44Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,739 | ["dev/breeze/src/airflow_breeze/breeze.py", "dev/breeze/src/airflow_breeze/cache.py", "dev/breeze/src/airflow_breeze/docs_generator/__init__.py", "dev/breeze/src/airflow_breeze/docs_generator/build_documentation.py", "dev/breeze/src/airflow_breeze/docs_generator/doc_builder.py", "dev/breeze/src/airflow_breeze/global_constants.py", "dev/breeze/src/airflow_breeze/utils/docker_command_utils.py", "dev/breeze/tests/test_commands.py", "dev/breeze/tests/test_global_constants.py"] | Breeze: Build documentation with Breeze | The "build-documentation" action should be rewritten in python in the new Breeze2 command.
It should allow for the same parameters that are already used by the https://github.com/apache/airflow/blob/main/docs/build_docs.py script: `--package-filter` for example.
It should basically run ./build_docs.py using the CI image.
Also we should add `airflow-start-doc-server` as a separate entrypoint (similar to airflow-freespace). | https://github.com/apache/airflow/issues/20739 | https://github.com/apache/airflow/pull/20886 | 793684a88ce3815568c585c45eb85a74fa5b2d63 | 81c85a09d944021989ab530bc6caf1d9091a753c | "2022-01-07T08:49:46Z" | python | "2022-01-24T10:45:52Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,655 | ["airflow/www/views.py", "tests/www/views/test_views_tasks.py"] | Edit Task Instance Page does not save updates. | ### Apache Airflow version
main (development)
### What happened
On the Task Instances table, one can click on `Edit Record` and be directed to an `Edit Task Instance` page. There, if you try to change the state, it will simply not update, nor are there any errors in the browser console.
<img width="608" alt="Screen Shot 2022-01-04 at 10 10 52 AM" src="https://user-images.githubusercontent.com/4600967/148101472-fb6076fc-3fc3-44c4-a022-6625d5112551.png">
### What you expected to happen
Changes to state should actually update the task instance.
### How to reproduce
Try to use the `Edit Task Instance` page
### Operating System
Mac OSx
### Versions of Apache Airflow Providers
_No response_
### Deployment
Virtualenv installation
### Deployment details
_No response_
### Anything else
This page is redundant, as a user can successfully edit task instance states from the table. Instead of wiring up the correct actions, I say we should just remove the `Edit Task Instance` page entirely.
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20655 | https://github.com/apache/airflow/pull/30415 | 22bef613678e003dde9128ac05e6c45ce934a50c | b140c4473335e4e157ff2db85148dd120c0ed893 | "2022-01-04T17:37:21Z" | python | "2023-04-22T17:10:49Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,632 | ["chart/templates/flower/flower-deployment.yaml", "chart/templates/pgbouncer/pgbouncer-deployment.yaml", "chart/templates/scheduler/scheduler-deployment.yaml", "chart/templates/statsd/statsd-deployment.yaml", "chart/templates/triggerer/triggerer-deployment.yaml", "chart/templates/webserver/webserver-deployment.yaml", "chart/templates/workers/worker-deployment.yaml", "chart/tests/test_airflow_common.py", "chart/values.schema.json", "chart/values.yaml"] | Add priorityClassName support | ### Official Helm Chart version
1.2.0
### Apache Airflow version
2.1.4
### Kubernetes Version
1.21.2
### Helm Chart configuration
_No response_
### Docker Image customisations
_No response_
### What happened
_No response_
### What you expected to happen
_No response_
### How to reproduce
_No response_
### Anything else
It's currently impossible to assign a `priorityClassName` to the Airflow containers.
Seems like a very useful feature to ensure that the Airflow infrastructure has higher priority than "regular" pods.
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20632 | https://github.com/apache/airflow/pull/20794 | ab762a5a8ae147ae33500ee3c7e7a73d25d03ad7 | ec41fd51e07ca2b9a66e0b99b730300c80a6d059 | "2022-01-03T14:06:15Z" | python | "2022-02-04T19:51:36Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,627 | ["setup.cfg"] | Bump flask-appbuilder to >=3.3.4, <4.0.0 | ### Description
Reason: Improper Authentication in Flask-AppBuilder: https://github.com/advisories/GHSA-m3rf-7m4w-r66q
### Use case/motivation
_No response_
### Related issues
https://github.com/advisories/GHSA-m3rf-7m4w-r66q
### Are you willing to submit a PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20627 | https://github.com/apache/airflow/pull/20628 | e1fbfc60d29af3fc0928b904ac80ca3b71a3a839 | 97261c642cbf07db91d252cf6b0b7ff184cd64c6 | "2022-01-03T11:50:47Z" | python | "2022-01-03T14:29:51Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,580 | ["airflow/api/common/experimental/mark_tasks.py"] | Triggers are not terminated when DAG is mark failed | ### Apache Airflow version
2.2.3 (latest released)
### What happened
Hello, I am quite excited about the new [Deferrable ("Async") Operators in AIP-40](https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=177050929), and I am going to adopt this by developing a new async version of ExternalTaskSensor. But while doing so, I found out that Deferrable ("Async") Operators are not canceled when the DAG is marked failed/success.

When I mark the DAG as failed, `wait_task` is canceled (changed to `failed` state), but `wait_task_async` is still in the `deferred` state and the triggerer keeps poking.


### What you expected to happen
Deferrable ("Async") Operators should be canceled just like the sync versions of the operators are.
### How to reproduce
Testing DAG.
```python
from datetime import datetime, timedelta
from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.sensors.external_task import ExternalTaskSensor, ExternalTaskSensorAsync
with DAG(
'tutorial_async_sensor',
default_args={
'depends_on_past': False,
'email': ['[email protected]'],
'email_on_failure': False,
'email_on_retry': False,
'retries': 1,
'retry_delay': timedelta(minutes=5),
},
description='A simple tutorial DAG using external sensor async',
schedule_interval=timedelta(days=1),
start_date=datetime(2021, 1, 1),
catchup=False,
tags=['example'],
) as dag:
t1 = ExternalTaskSensorAsync(
task_id='wait_task_async',
external_dag_id="tutorial",
external_task_id="sleep",
execution_delta=timedelta(hours=1),
poke_interval=5.0
)
t2 = ExternalTaskSensor(
task_id='wait_task',
external_dag_id="tutorial",
external_task_id="sleep",
execution_delta=timedelta(hours=1),
poke_interval=5.0
)
t3 = BashOperator(
task_id='echo',
depends_on_past=False,
bash_command='echo Hello world',
retries=3,
)
[t1, t2] >> t3
```
for `ExternalTaskSensorAsync`, see #20583
### Operating System
Ubuntu 20.04
### Versions of Apache Airflow Providers
_No response_
### Deployment
Virtualenv installation
### Deployment details
run `airflow standalone`
### Anything else
_No response_
### Are you willing to submit PR?
- [x] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20580 | https://github.com/apache/airflow/pull/20649 | b83084b1b05415079972a76e3e535d40a1998de8 | 64c0bd50155dfdb84671ac35d645b812fafa78a1 | "2021-12-30T07:29:20Z" | python | "2022-01-05T07:42:57Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,579 | ["airflow/cli/commands/task_command.py", "tests/cli/commands/test_task_command.py"] | Airflow 2.2.3 : "airflow dags trigger" command gets "Calling `DAG.create_dagrun()` without an explicit data interval is deprecated" | ### Apache Airflow version
2.2.3 (latest released)
### What happened
When issuing the following command:
`airflow dags trigger 'VMWARE_BACKUP' --conf '{"VM_NAME":"psfiplb1"}'`
the system replies with:
```
/usr/local/lib/python3.8/dist-packages/airflow/api/common/experimental/trigger_dag.py:85 DeprecationWarning: Calling `DAG.create_dagrun()` without an explicit data interval is deprecated
Created <DagRun VMWARE_BACKUP @ 2021-12-31T00:00:00+00:00: manual__2021-12-31T00:00:00+00:00, externally triggered: True>
```
### What you expected to happen
I have no other switch/parameter to choose to avoid the deprecation warning.
My concern is about what will happen to the command when the deprecation turns into an error.
### How to reproduce
_No response_
### Operating System
Ubuntu 20.04.3 LTS
### Versions of Apache Airflow Providers
apache-airflow-providers-celery==2.1.0
apache-airflow-providers-ftp==2.0.1
apache-airflow-providers-http==2.0.1
apache-airflow-providers-imap==2.0.1
apache-airflow-providers-microsoft-mssql==2.0.1
apache-airflow-providers-microsoft-winrm==2.0.1
apache-airflow-providers-openfaas==2.0.0
apache-airflow-providers-oracle==2.0.1
apache-airflow-providers-samba==3.0.1
apache-airflow-providers-sftp==2.3.0
apache-airflow-providers-sqlite==2.0.1
apache-airflow-providers-ssh==2.3.0
### Deployment
Virtualenv installation
### Deployment details
Airflow inside an LXD continer
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20579 | https://github.com/apache/airflow/pull/27106 | c3095d77b81c1a3a6510246b1fea61e6423d518b | 70680ded7a4056882008b019f5d1a8f559a301cd | "2021-12-30T05:39:01Z" | python | "2023-03-16T19:08:17Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,560 | ["chart/templates/secrets/elasticsearch-secret.yaml", "chart/tests/test_elasticsearch_secret.py"] | Elasticsearch connection user and password should be optional | ### Official Helm Chart version
main (development)
### Apache Airflow version
2.2.3 (latest released)
### Kubernetes Version
1.21.5
### Helm Chart configuration
```yaml
elasticsearch:
enabled: true
connection:
host: elasticsearch-master
port: 9200
```
### Docker Image customisations
_No response_
### What happened
Secret was created with this connection:
```
http://%3Cno+value%3E:%3Cno+value%3E@elasticsearch-master:9200
```
### What you expected to happen
Secret created with this connection:
```
http://elasticsearch-master:9200
```
### How to reproduce
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20560 | https://github.com/apache/airflow/pull/21222 | 5a6a2d604979cb70c5c9d3797738f0876dd38c3b | 5dc6338346deca8e5a9a47df2da19f38eeac0ce8 | "2021-12-29T21:57:46Z" | python | "2022-02-11T04:30:49Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,552 | ["airflow/providers/google/cloud/transfers/sftp_to_gcs.py", "tests/providers/google/cloud/transfers/test_sftp_to_gcs.py"] | gzip parameter of sftp_to_gcs operator is never passed to the GCS hook | ### Apache Airflow Provider(s)
google
### Versions of Apache Airflow Providers
List of versions is the one provided by Cloud Composer image `composer-1.17.7-airflow-2.1.4`.
(cf. https://cloud.google.com/composer/docs/concepts/versioning/composer-versions#images )
Relevant here is `apache-airflow-providers-google==5.1.0`
### Apache Airflow version
2.1.4
### Deployment
Composer
### Deployment details
_No response_
### What happened
Found on version 2.1.4 and reproduced locally on main branch (v2.2.3).
When using `SFTPToGCSOperator` with `gzip=True`, no compression is actually performed, the files are copied/moved as-is to GCS.
This happens because the `gzip` parameter isn't passed to the GCS Hook `upload()` call which then defaults to `False`.
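For reference, a simplified sketch of the missing pass-through (not the exact operator code, but `GCSHook.upload()` does accept a `gzip` argument):
```python
from tempfile import NamedTemporaryFile

def copy_single_object_sketch(sftp_hook, gcs_hook, source_path, bucket_name,
                              destination_object, mime_type, gzip):
    # Download from SFTP to a temp file, then upload to GCS, forwarding `gzip`.
    with NamedTemporaryFile("w") as tmp:
        sftp_hook.retrieve_file(source_path, tmp.name)
        gcs_hook.upload(
            bucket_name=bucket_name,
            object_name=destination_object,
            filename=tmp.name,
            mime_type=mime_type,
            gzip=gzip,  # currently never forwarded by the operator, so it defaults to False
        )
```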
### What you expected to happen
I expect the files to be compressed when `gzip=True`.
### How to reproduce
Create any `SFTPToGCSOperator` with `gzip=True` and upload a file.
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20552 | https://github.com/apache/airflow/pull/20553 | c5c18c54fa83463bc953249dc28edcbf7179da17 | 3a480f5ff41c2da4ae4fd6b2289e064ee42048a5 | "2021-12-29T10:51:58Z" | python | "2021-12-29T12:20:18Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,545 | ["docs/apache-airflow/templates-ref.rst"] | built in macros (macros.random, macros.time) need documentation change | ### Apache Airflow version
2.2.3 (latest released)
### What happened
My gut says that the way forward is to change the macros object so that it only exposes modules:
- datetime
- time
- uuid
- random
... and then leave it to the user to decide which functions on those modules they want to call. I'm not confident enough to make that change. If instead we want to change the docs to match the actual functionality, I can submit a PR for that.
### What you expected to happen
When using either of the built-in macros `time` or `random`, they don't resolve to `datetime.time` or the `random` module as documented; instead, `macros.time` is the built-in `time` module, and `macros.random` is a function (`random.random`) rather than the module.
### How to reproduce
```python
import datetime as dt
import time
from uuid import uuid4
from textwrap import dedent
from airflow.models import DAG
from airflow.operators.python import PythonOperator
from dateutil.parser import parse as dateutil_parse
"""
According to the docs:
macros.datetime - datetime.datetime
macros.timedelta - datetime.timedelta
macros.dateutil - dateutil package
macros.time - datetime.time
macros.uuid - python standard lib uuid
macros.random - python standard lib random
According to the code:
macros.datetime - datetime.datetime
macros.timedelta - datetime.timedelta
macros.dateutil - dateutil package
macros.time - python standard lib time <--- differs
macros.uuid - python standard lib uuid
macros.random - random.random <--- differs
"""
def date_time(datetime_obj):
compare_obj = dt.datetime(2021, 12, 12, 8, 32, 23)
assert datetime_obj == compare_obj
def time_delta(timedelta_obj):
compare_obj = dt.timedelta(days=3, hours=4)
assert timedelta_obj == compare_obj
def date_util(dateutil_obj):
compare_obj = dateutil_parse("Thu Sep 26 10:36:28 2019")
assert dateutil_obj == compare_obj
def time_tester(time_obj):
# note that datetime.time.time() gives an AttributeError
# time.time() on the other hand, returns a float
# this works because macro.time isn't 'datetime.time', like the docs say
# it's just 'time'
compare_obj = time.time()
print(time_obj)
print(compare_obj)
# the macro might have captured a slightly different time than the task,
# but they're not going to be more than 10s apart
assert abs(time_obj - compare_obj) < 10
def uuid_tester(uuid_obj):
compare_obj = uuid4()
assert len(str(uuid_obj)) == len(str(compare_obj))
def random_tester(random_float):
# note that 'random.random' is a function that returns a float
# while 'random' is a module (and isn't callable)
# the macro was 'macros.random()' and here we have a float:
assert -0.1 < random_float < 100.1
# so the docs are wrong here too
# macros.random actually returns a function, not the random module
def show_docs(attr):
print(attr.__doc__)
with DAG(
dag_id="builtin_macros_with_airflow_specials",
schedule_interval=None,
start_date=dt.datetime(1970, 1, 1),
render_template_as_native_obj=True, # render templates using Jinja NativeEnvironment
tags=["core"],
) as dag:
test_functions = {
"datetime": (date_time, "{{ macros.datetime(2021, 12, 12, 8, 32, 23) }}"),
"timedelta": (time_delta, "{{ macros.timedelta(days=3, hours=4) }}"),
"dateutil": (
date_util,
"{{ macros.dateutil.parser.parse('Thu Sep 26 10:36:28 2019') }}",
),
"time": (time_tester, "{{ macros.time.time() }}"),
"uuid": (uuid_tester, "{{ macros.uuid.uuid4() }}"),
"random": (
random_tester,
"{{ 100 * macros.random() }}",
),
}
for name, (func, template) in test_functions.items():
(
PythonOperator(
task_id=f"showdoc_{name}",
python_callable=show_docs,
op_args=[f"{{{{ macros.{name} }}}}"],
)
>> PythonOperator(
task_id=f"test_{name}", python_callable=func, op_args=[template]
)
)
```
### Operating System
Docker (debian:buster)
### Versions of Apache Airflow Providers
2.2.2, and 2.2.3
### Deployment
Other
### Deployment details
Astro CLI with sequential executor
### Anything else
Rather than changing the docs to describe what the code actually does, would it be better to make the code behave in a way that is more consistent (e.g. top-level modules only)?
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20545 | https://github.com/apache/airflow/pull/20637 | b1b8f304586e8ad181861dfe8ac15297c78f917b | 8b2299b284ac15900f54bf8c84976cc01f4d597c | "2021-12-29T01:12:01Z" | python | "2022-01-04T03:39:51Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,534 | ["airflow/www/views.py"] | is not bound to a Session; lazy load operation of attribute 'dag_model' | ### Apache Airflow version
2.2.3 (latest released)
### What happened
An error occurred while browsing the page (/rendered-k8s?dag_id=xxx&task_id=yyy&execution_date=zzz):
```
Parent instance <TaskInstance at 0x7fb0b01df940> is not bound to a Session; lazy load operation of attribute 'dag_model' cannot proceed (Background on this error at: http://sqlalche.me/e/13/bhk3) Traceback (most recent call last):
args=self.command_as_list(),
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 527, in command_as_list
dag = self.dag_model
File "/home/airflow/.local/lib/python3.9/site-packages/sqlalchemy/orm/attributes.py", line 294, in __get__
return self.impl.get(instance_state(instance), dict_)
File "/home/airflow/.local/lib/python3.9/site-packages/sqlalchemy/orm/attributes.py", line 730, in get
value = self.callable_(state, passive)
File "/home/airflow/.local/lib/python3.9/site-packages/sqlalchemy/orm/strategies.py", line 717, in _load_for_state
raise orm_exc.DetachedInstanceError(
sqlalchemy.orm.exc.DetachedInstanceError: Parent instance <TaskInstance at 0x7fb0b01df940> is not bound to a Session; lazy load operation of attribute 'dag_model' cannot proceed (Background on this error at: http://sqlalche.me/e/13/bhk3)
```
### What you expected to happen
_No response_
### How to reproduce
_No response_
### Operating System
NAME="CentOS Stream" VERSION="8"
### Versions of Apache Airflow Providers
```
apache-airflow-providers-cncf-kubernetes==2.2.0
apache-airflow-providers-elasticsearch==2.0.3
apache-airflow-providers-ftp==2.0.1
apache-airflow-providers-http==2.0.1
apache-airflow-providers-imap==2.0.1
apache-airflow-providers-mysql==2.1.1
apache-airflow-providers-postgres==2.3.0
apache-airflow-providers-sftp==2.1.1
apache-airflow-providers-sqlite==2.0.1
apache-airflow-providers-ssh==2.2.0
```
### Deployment
Other 3rd-party Helm chart
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20534 | https://github.com/apache/airflow/pull/21006 | 372849486cd455a4ff4821b01805a442f1a78417 | a665f48b606065977e0d3952bc74635ce11726d1 | "2021-12-28T09:11:55Z" | python | "2022-01-21T13:44:40Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,471 | ["airflow/models/dag.py", "airflow/models/dagrun.py", "tests/models/test_dag.py", "tests/models/test_dagrun.py", "tests/models/test_taskinstance.py"] | no logs returned when an end_date specified at the task level is after dag's end date | ### Apache Airflow version
2.2.2
### What happened
When you set a task-level 'end_date' that is after the DAG parameter 'end_date', the task fails and no logs are given to the user.
### What you expected to happen
I expected the task to succeed with an 'end_date' of the date specified at the task level.
### How to reproduce
```
from airflow.models import DAG
from airflow.operators.python import PythonOperator
from datetime import datetime
def this_passes():
pass
with DAG(
dag_id="end_date",
schedule_interval=None,
start_date=datetime(2021, 1, 1),
end_date=datetime(2021, 1, 2),
tags=["dagparams"],
) as dag:
t1 = PythonOperator(
task_id="passer",
python_callable=this_passes,
end_date=datetime(2021, 2, 1),
)
```
### Operating System
Docker (debian:buster)
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other Docker-based deployment
### Deployment details
Using the astro cli with docker image:
quay.io/astronomer/ap-airflow:2.2.2-buster-onbuild
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20471 | https://github.com/apache/airflow/pull/20920 | f612a2f56add4751e959625c49368d09a2a47d55 | 85871eba420f3324432f55f74fe57005ff47a21c | "2021-12-22T20:54:12Z" | python | "2022-03-27T19:11:38Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,457 | ["airflow/providers/amazon/aws/hooks/base_aws.py", "airflow/providers/amazon/aws/hooks/glue.py", "airflow/providers/amazon/aws/hooks/s3.py"] | Release 2.5.1 of Amazon Provider | ### Apache Airflow version
2.2.3 (latest released)
### What happened
The 2.5.0 version of Amazon Provider contains breaking changes - https://airflow.apache.org/docs/apache-airflow-providers-amazon/stable/commits.html specifically https://github.com/apache/airflow/commit/83b51e53062dc596a630edd4bd01407a556f1aa6 or the combination of the following:
- https://github.com/apache/airflow/commit/83b51e53062dc596a630edd4bd01407a556f1aa6
- https://github.com/apache/airflow/commit/d58df468c8d77c5d45e80f2333eb074bb7771a95
- https://github.com/apache/airflow/commit/4be04143a5f7e246127e942bf1d73abcd22ce189
and confirmed by @uranusjr
I have yanked 2.5.0 and we will need to release ~3.0.0~ 2.5.1 with backwards compatibility fixes
I have updated the constraints for 2.2.3 for now - https://github.com/apache/airflow/commit/62d490d4da17e35d4ddcd4ee38902a8a4e9bbfff
UPDATED: (@potiuk) to reflect that we are going to release 2.5.1 instead of 3.0.0
| https://github.com/apache/airflow/issues/20457 | https://github.com/apache/airflow/pull/20463 | 81f92d6c321992905d239bb9e8556720218fe745 | 2ab2ae8849bf6d80a700b1b74cef37eb187161ad | "2021-12-21T22:59:01Z" | python | "2021-12-22T16:52:20Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,419 | ["docs/apache-airflow/howto/email-config.rst", "docs/apache-airflow/img/email_connection.png"] | Need clearer documentation on setting up Sendgrid to send emails | ### Describe the issue with documentation
https://airflow.apache.org/docs/apache-airflow/stable/howto/email-config.html
needs to have more documentation on how to use Sendgrid to send emails. For example, would installing the provider package guarantee the appearance of the "Email" connection type in the frontend? I do not think we can find an Email connection. Secondly, do we need to import the provider packages in our DAGs to make it work, or should options like email_on_success etc. work automatically once this is set up?
### How to solve the problem
Better documentation
### Anything else
Better documentation.
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20419 | https://github.com/apache/airflow/pull/21958 | cc4b05654e9c8b6d1b3185c5690da87a29b66a4b | c9297808579c0d4f93acfd6791172193be19721b | "2021-12-20T11:18:14Z" | python | "2022-03-03T12:07:21Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,373 | ["airflow/www/static/js/callModal.js", "airflow/www/static/js/dag.js", "airflow/www/static/js/gantt.js", "airflow/www/static/js/graph.js"] | Paused event is logging twice in some pages. | ### Apache Airflow version
2.2.2 (latest released)
### What happened
If I pause/unpause a DAG on the `Tree`, `Graph`, or `Gantt` page, the `/paused` URL is called twice.

So paused events are logged twice.

### What you expected to happen
It should be logging only once.
### How to reproduce
Go to the `Tree`, `Graph`, or `Gantt` page and pause/unpause the DAG.
### Operating System
centos
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other 3rd-party Helm chart
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20373 | https://github.com/apache/airflow/pull/28410 | 9c3734bb127ff0d71a0321d0578e556552cfc934 | 2f0f02536f7773dd782bd980ae932091b7badc61 | "2021-12-17T08:59:14Z" | python | "2022-12-22T13:19:40Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,323 | ["airflow/decorators/__init__.py", "airflow/decorators/__init__.pyi", "airflow/decorators/sensor.py", "airflow/example_dags/example_sensor_decorator.py", "airflow/sensors/python.py", "docs/apache-airflow/tutorial/taskflow.rst", "tests/decorators/test_sensor.py"] | Add a `@task.sensor` TaskFlow decorator | ### Description
Implement a taskflow decorator that uses the decorated function as the poke method.
### Use case/motivation
Here is a sketch of the solution that might work:
[sensor_decorator.txt](https://github.com/apache/airflow/files/7721589/sensor_decorator.txt)
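For illustration, roughly how such a decorator might look from a DAG author's perspective (the decorator and its arguments are hypothetical at the time of this request):
```python
# Hypothetical usage sketch — @task.sensor does not exist yet when this issue was filed.
import os
from datetime import datetime

from airflow.decorators import dag, task

@task.sensor(poke_interval=60, timeout=3600, mode="reschedule")
def wait_for_marker(path: str) -> bool:
    """The decorated function body acts as the sensor's poke method."""
    return os.path.exists(path)

@dag(schedule_interval=None, start_date=datetime(2021, 1, 1), catchup=False)
def example_sensor_decorator():
    wait_for_marker("/tmp/data/_SUCCESS")

example_dag = example_sensor_decorator()
```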
### Related issues
_No response_
### Are you willing to submit a PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20323 | https://github.com/apache/airflow/pull/22562 | a50195d617ca7c85d56b1c138f46451bc7599618 | cfd63df786e0c40723968cb8078f808ca9d39688 | "2021-12-15T17:37:59Z" | python | "2022-11-07T02:06:19Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,320 | ["airflow/api_connexion/openapi/v1.yaml", "airflow/www/static/js/types/api-generated.ts", "tests/api_connexion/endpoints/test_user_endpoint.py"] | Airflow API throwing 500 when user does not have last name | ### Apache Airflow version
2.2.2 (latest released)
### What happened
When listing users via GET api/v1/users, a 500 error is thrown if a user in the database does not have a last name.
```
{
"detail": "'' is too short\n\nFailed validating 'minLength' in schema['allOf'][0]['properties']['users']['items']['properties']['last_name']:\n {'description': 'The user lastname', 'minLength': 1, 'type': 'string'}\n\nOn instance['users'][25]['last_name']:\n ''",
"status": 500,
"title": "Response body does not conform to specification",
"type": "https://airflow.apache.org/docs/apache-airflow/2.2.2/stable-rest-api-ref.html#section/Errors/Unknown"
}
```
### What you expected to happen
Result set should still be returned with a null or empty value for the last name instead of a 500 error.
### How to reproduce
Create a user in the database without a last name and then hit GET api/v1/users
### Operating System
linux
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other Docker-based deployment
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20320 | https://github.com/apache/airflow/pull/25476 | 98f16aa7f3b577022791494e13b6aa7057afde9d | 3421ecc21bafaf355be5b79ec4ed19768e53275a | "2021-12-15T17:15:42Z" | python | "2022-08-02T21:06:33Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,259 | ["Dockerfile", "Dockerfile.ci", "breeze", "dev/README_RELEASE_PROVIDER_PACKAGES.md", "docs/docker-stack/build-arg-ref.rst", "scripts/ci/libraries/_build_images.sh", "scripts/ci/libraries/_initialization.sh", "scripts/docker/compile_www_assets.sh", "scripts/docker/install_airflow.sh", "scripts/docker/prepare_node_modules.sh"] | Make sure that stderr is "clean" while building the images. | Our image generates a lot of "Noise" during building, which makes "real" errors difficult to distinguish from "false negatives".
We should make sure that there are no warnings while the docker image is built.
If there are any warnings that we cannot remove there, we should add a reassuring note that we know what we are doing. | https://github.com/apache/airflow/issues/20259 | https://github.com/apache/airflow/pull/20238 | 5980d2b05eee484256c634d5efae9410265c65e9 | 4620770af4550251b5139bb99185656227335f67 | "2021-12-13T15:44:16Z" | python | "2022-01-11T09:38:34Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,252 | ["dev/breeze/src/airflow_breeze/breeze.py", "dev/breeze/src/airflow_breeze/cache.py", "dev/breeze/src/airflow_breeze/global_constants.py", "dev/breeze/src/airflow_breeze/visuals/__init__.py"] | Breeze: Asciiart disabling | This is a small thing, but might be useful. Currently Breeze prints the Asciiart when started. You should be able to persistently disable the asciiart.
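A minimal sketch of the flag-file approach described below (the file name is an assumption):
```python
from pathlib import Path

BUILD_CACHE_DIR = Path(".build")
ASCIIART_DISABLED_FILE = BUILD_CACHE_DIR / "suppress_asciiart"  # name is an assumption

def should_print_asciiart() -> bool:
    """The ASCII art is printed unless the user has persistently disabled it."""
    return not ASCIIART_DISABLED_FILE.exists()

def disable_asciiart() -> None:
    BUILD_CACHE_DIR.mkdir(parents=True, exist_ok=True)
    ASCIIART_DISABLED_FILE.touch()
```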
This is done with storing the right "flag" file in the ".build" directory. If the file is there, the ASCIIART should not be printed. If not, it should be printed. | https://github.com/apache/airflow/issues/20252 | https://github.com/apache/airflow/pull/20645 | c9023fad4287213e4d3d77f4c66799c762bff7ba | 8cc93c4bc6ae5a99688ca2effa661d6a3e24f56f | "2021-12-13T09:04:20Z" | python | "2022-01-11T18:16:22Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,251 | ["dev/breeze/doc/BREEZE2.md", "scripts/ci/libraries/_docker_engine_resources.sh", "scripts/in_container/run_resource_check.py", "scripts/in_container/run_resource_check.sh"] | Breeze: Verify if there are enough resources available in Breeze | At entry of the Breeze command we verify if there is enough CPU/memory/disk space and print (coloured) information if the resources are not enough.
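For illustration, a minimal Python sketch of an equivalent check (using `psutil`/`shutil`; the thresholds are assumptions, not the values the bash version checks for):
```python
import shutil

import psutil

MINIMUM_CPUS = 2        # assumption
MINIMUM_MEMORY_GB = 4   # assumption
MINIMUM_DISK_GB = 20    # assumption

def check_resources() -> None:
    cpus = psutil.cpu_count()
    memory_gb = psutil.virtual_memory().total / 2**30
    disk_gb = shutil.disk_usage("/").free / 2**30
    warning = "\033[33m{}\033[0m"  # coloured (yellow) output
    if cpus < MINIMUM_CPUS:
        print(warning.format(f"WARNING: only {cpus} CPUs available, at least {MINIMUM_CPUS} recommended"))
    if memory_gb < MINIMUM_MEMORY_GB:
        print(warning.format(f"WARNING: only {memory_gb:.1f} GB RAM available, at least {MINIMUM_MEMORY_GB} GB recommended"))
    if disk_gb < MINIMUM_DISK_GB:
        print(warning.format(f"WARNING: only {disk_gb:.1f} GB free disk space, at least {MINIMUM_DISK_GB} GB recommended"))
```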
We should replicate that in Python | https://github.com/apache/airflow/issues/20251 | https://github.com/apache/airflow/pull/20763 | b8526abc2c220b1e07eed83694dfee972c2e2609 | 75755d7f65fb06c6e2e74f805b877774bfa7fcda | "2021-12-13T09:01:19Z" | python | "2022-01-19T11:51:51Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,249 | ["airflow/jobs/base_job.py", "airflow/migrations/versions/587bdf053233_adding_index_for_dag_id_in_job.py", "docs/apache-airflow/migrations-ref.rst"] | DAG deletion is slow due to lack of database indexes on dag_id | ### Apache Airflow version
2.2.1
### What happened
We have an airflow instance for approximately 6k DAGs.
- If we delete a DAG from UI, the UI times out
- If we delete a DAG from CLI, it completes but sometimes takes up to a half-hour to finish.
Most of the execution time appears to be consumed in database queries. I know I can just throw more CPU and memory at the DB instance and hope it works, but I think we can do better during the delete operation. Correct me if I am wrong, but I think this is the code that gets executed when deleting a DAG from the UI or CLI via `delete_dag.py`:
```python
for model in models.base.Base._decl_class_registry.values():
if hasattr(model, "dag_id"):
if keep_records_in_log and model.__name__ == 'Log':
continue
cond = or_(model.dag_id == dag_id, model.dag_id.like(dag_id + ".%"))
count += session.query(model).filter(cond).delete(synchronize_session='fetch')
if dag.is_subdag:
parent_dag_id, task_id = dag_id.rsplit(".", 1)
for model in TaskFail, models.TaskInstance:
count += (
session.query(model).filter(model.dag_id == parent_dag_id, model.task_id == task_id).delete()
)
```
I see we are iterating over all the models and doing a `dag_id` match. Some of the tables don't have an index on the `dag_id` column (like `job`), which is making this operation really slow. Adding such an index could be one easy fix for this issue.
For example, the following query took 20 mins to finish on a 16 CPU / 32 GB Postgres instance:
```sql
SELECT job.id AS job_id FROM job WHERE job.dag_id = $1 OR job.dag_id LIKE $2
```
and explain is as follows
```sql
EXPLAIN SELECT job.id AS job_id FROM job WHERE job.dag_id = '';
QUERY PLAN
---------------------------------------------------------------------------
Gather (cost=1000.00..1799110.10 rows=6351 width=8)
Workers Planned: 2
-> Parallel Seq Scan on job (cost=0.00..1797475.00 rows=2646 width=8)
Filter: ((dag_id)::text = ''::text)
(4 rows)
```
This is just one of the many queries that are being executed during the delete operation.
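For reference, a sketch of the kind of index fix suggested above, written as an Alembic migration body (the index name is an assumption):
```python
from alembic import op

def upgrade():
    op.create_index("idx_job_dag_id", "job", ["dag_id"], unique=False)

def downgrade():
    op.drop_index("idx_job_dag_id", table_name="job")
```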
### What you expected to happen
Deletion of DAG should not take this much time.
### How to reproduce
_No response_
### Operating System
nix
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other Docker-based deployment
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20249 | https://github.com/apache/airflow/pull/20282 | 6d25d63679085279ca1672c2eee2c45d6704efaa | ac9f29da200c208bb52d412186c5a1b936eb0b5a | "2021-12-13T07:31:08Z" | python | "2021-12-30T10:26:24Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,215 | ["airflow/providers/amazon/aws/example_dags/example_emr_serverless.py", "airflow/providers/amazon/aws/hooks/emr.py", "airflow/providers/amazon/aws/operators/emr.py", "airflow/providers/amazon/aws/sensors/emr.py", "airflow/providers/amazon/provider.yaml", "docs/apache-airflow-providers-amazon/operators/emr_serverless.rst", "tests/providers/amazon/aws/hooks/test_emr_serverless.py", "tests/providers/amazon/aws/operators/test_emr_serverless.py"] | EMR serverless, new operator | ### Description
A new EMR serverless has been announced and it is already available, see:
- https://aws.amazon.com/blogs/big-data/announcing-amazon-emr-serverless-preview-run-big-data-applications-without-managing-servers/
- https://aws.amazon.com/emr/serverless/
Having an operator for creating applications and submitting jobs to EMR serverless would be awesome.
### Use case/motivation
New operator for working with EMR serverless.
### Related issues
_No response_
### Are you willing to submit a PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20215 | https://github.com/apache/airflow/pull/25324 | 5480b4ca499cfe37677ac1ae1298a2737a78115d | 8df84e99b7319740990124736d0fc545165e7114 | "2021-12-11T01:40:22Z" | python | "2022-08-05T16:54:57Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,197 | ["airflow/www/templates/airflow/dag.html", "tests/www/views/test_views_tasks.py"] | Triggering a DAG in Graph view switches to Tree view | ### Apache Airflow version
2.2.2 (latest released)
### What happened
After I trigger a DAG from the Graph view, the view switches to the Tree view (happens on both 2.2.2 and main).
### What you expected to happen
I expect to stay in the Graph view, ideally switch to the newly created DAG run
### How to reproduce
Trigger a DAG from the Graph view
### Operating System
N/A
### Versions of Apache Airflow Providers
N/A
### Deployment
Astronomer
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20197 | https://github.com/apache/airflow/pull/20955 | 10f5db863e387c0fd7369cf521d624b6df77a65d | 928dafe6c495bbf3e03d14473753fce915134a46 | "2021-12-10T11:15:26Z" | python | "2022-01-20T08:15:37Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,155 | ["setup.py"] | ssl_cert_reqs has been deprecated in pymongo 3.9 | ### Apache Airflow Provider(s)
mongo
### Versions of Apache Airflow Providers
2.2.0
### Apache Airflow version
2.2.2 (latest released)
### Operating System
Debian GNU/Linux 10 (buster)
### Deployment
Official Apache Airflow Helm Chart
### Deployment details
_No response_
### What happened
`ssl_cert_reqs` has been deprecated in pymongo 3.9
https://github.com/apache/airflow/blob/providers-mongo/2.2.0/airflow/providers/mongo/hooks/mongo.py#L94
```
{taskinstance.py:1703} ERROR - Task failed with exception
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1332, in _run_raw_task
self._execute_task_with_callbacks(context)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1458, in _execute_task_with_callbacks
result = self._execute_task(context, self.task)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/taskinstance.py", line 1509, in _execute_task
result = execute_callable(context=context)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/transfers/mongo_to_s3.py", line 123, in execute
mongo_db=self.mongo_db,
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/mongo/hooks/mongo.py", line 145, in find
collection = self.get_collection(mongo_collection, mongo_db=mongo_db)
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/mongo/hooks/mongo.py", line 116, in get_collection
mongo_conn: MongoClient = self.get_conn()
File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/mongo/hooks/mongo.py", line 96, in get_conn
self.client = MongoClient(self.uri, **options)
File "/home/airflow/.local/lib/python3.7/site-packages/pymongo/mongo_client.py", line 707, in __init__
keyword_opts.cased_key(k), v) for k, v in keyword_opts.items()))
File "/home/airflow/.local/lib/python3.7/site-packages/pymongo/mongo_client.py", line 707, in <genexpr>
keyword_opts.cased_key(k), v) for k, v in keyword_opts.items()))
File "/home/airflow/.local/lib/python3.7/site-packages/pymongo/common.py", line 740, in validate
value = validator(option, value)
File "/home/airflow/.local/lib/python3.7/site-packages/pymongo/common.py", line 144, in raise_config_error
raise ConfigurationError("Unknown option %s" % (key,))
pymongo.errors.ConfigurationError: Unknown option ssl_cert_reqs
```
Ref:
https://pymongo.readthedocs.io/en/stable/changelog.html#changes-in-version-3-9-0
### What you expected to happen
_No response_
### How to reproduce
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20155 | https://github.com/apache/airflow/pull/20511 | 86a249007604307bdf0f69012dbd1b783c8750e5 | f85880e989d7751cfa3ae2d4665d7cc0cb3cc945 | "2021-12-09T08:30:58Z" | python | "2021-12-27T19:28:50Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,110 | ["airflow/www/extensions/init_views.py", "tests/api_connexion/test_cors.py"] | CORS access_control_allow_origin header never returned | ### Apache Airflow version
2.2.2 (latest released)
### What happened
To fix a CORS problem I added the [access_control_allow_headers](https://airflow.apache.org/docs/apache-airflow/stable/configurations-ref.html#access-control-allow-headers), [access_control_allow_methods](https://airflow.apache.org/docs/apache-airflow/stable/configurations-ref.html#access-control-allow-methods), [access_control_allow_origins](https://airflow.apache.org/docs/apache-airflow/stable/configurations-ref.html#access-control-allow-origins) variables to the 2.2.2 docker-compose file provided in the documentation. Both the headers and methods values are returned correctly, but origins never is.
### What you expected to happen
The CORS response returning the provided origin header value.
### How to reproduce
Download the latest docker-compose from documentation add the following lines:
`AIRFLOW__API__ACCESS_CONTROL_ALLOW_HEADERS: 'content-type, origin, authorization, accept'`
`AIRFLOW__API__ACCESS_CONTROL_ALLOW_METHODS: 'GET, POST, OPTIONS, DELETE'`
`AIRFLOW__API__ACCESS_CONTROL_ALLOW_ORIGINS: '*'`
Run it and call the API with a CORS preflight request.
### Operating System
Windows 11
### Versions of Apache Airflow Providers
_No response_
### Deployment
Docker-Compose
### Deployment details
_No response_
### Anything else
It's reproducible regardless of the ORIGINS value. There was a name change for this variable that's possibly not handled.
On 2.1.4 the same works without problems.
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20110 | https://github.com/apache/airflow/pull/25553 | 1d8507af07353e5cf29a860314b5ba5caad5cdf3 | e81b27e713e9ef6f7104c7038f0c37cc55d96593 | "2021-12-07T16:36:19Z" | python | "2022-08-05T17:41:05Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,092 | ["airflow/configuration.py", "docs/apache-airflow/howto/set-config.rst", "tests/core/test_configuration.py"] | PR #18772 breaks `sql_alchemy_conn_cmd` config | ### Apache Airflow version
2.2.1
### What happened
#18772 added two options to `tmp_configuration_copy()` and, for tasks run as the same user, defaults to generating the temp configuration file for the task runner with the options `include_env=False` and `include_cmds=False`. This is a change from the previous defaults, which materialized the values of `*_cmd` configs into the JSON dump.
This presents a problem: when using `sql_alchemy_conn_cmd` to set the database connection and running as the same user, the temporary config JSON dump now includes both `sql_alchemy_conn` (set to the Airflow distribution default) and the user-set `sql_alchemy_conn_cmd`. And because bare settings take precedence over `_cmd` versions, while the Airflow worker can connect to the configured DB, the task runner itself will instead use a non-existent SQLite DB, causing all tasks to fail.
TLDR; The temp JSON config dump used to look something like:
```
{
"core": {
"sql_alchemy_conn": "mysql://...",
...
}
}
```
Now it looks something like:
```
{
"core": {
"sql_alchemy_conn": "sqlite:////var/opt/airflow/airflow.db",
"sql_alchemy_conn_cmd": "/etc/airflow/get-config 'core/sql_alchemy_conn'"
...
}
}
```
But because `sql_alchemy_conn` is set, `sql_alchemy_conn_cmd` never gets called and tasks are unable to access the Airflow DB.
### What you expected to happen
I'm not quite sure what is the preferred method to fix this issue. I guess there are a couple options:
- Remove the bare `sensitive_config_values` if either `_cmd` or `_secret` versions exist
- Ignore the Airflow default value in addition to empty values for bare `sensitive_config_values` during parsing
- Go back to materializing the sensitive configs
### How to reproduce
With either Airflow 2.2.1 or 2.2.2 configure the DB with `sql_alchemy_conn_cmd` and remove the `sql_alchemy_conn` config from your config file. Then run the Airflow worker and task runner as the same user. Try running any task and see that the task tries to access the default SQLite store instead of the configured one.
### Operating System
Debian Bullseye
### Versions of Apache Airflow Providers
_No response_
### Deployment
Virtualenv installation
### Deployment details
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20092 | https://github.com/apache/airflow/pull/21539 | e93cd4b2cfa98be70f6521832cfbd4d6b5551e30 | e07bc63ec0e5b679c87de8e8d4cdff1cf4671146 | "2021-12-07T04:31:34Z" | python | "2022-03-15T18:06:50Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,063 | ["airflow/models/dag.py", "airflow/www/views.py"] | Forward slash in `dag_run_id` gives rise to trouble accessing things through the REST API | ### Apache Airflow version
2.1.4
### Operating System
linux
### Versions of Apache Airflow Providers
apache-airflow-providers-amazon==2.2.0
apache-airflow-providers-celery==2.0.0
apache-airflow-providers-cncf-kubernetes==2.0.2
apache-airflow-providers-docker==2.1.1
apache-airflow-providers-elasticsearch==2.0.3
apache-airflow-providers-ftp==2.0.1
apache-airflow-providers-google==5.1.0
apache-airflow-providers-grpc==2.0.1
apache-airflow-providers-hashicorp==2.1.0
apache-airflow-providers-http==2.0.1
apache-airflow-providers-imap==2.0.1
apache-airflow-providers-microsoft-azure==3.1.1
apache-airflow-providers-mysql==2.1.1
apache-airflow-providers-postgres==2.2.0
apache-airflow-providers-redis==2.0.1
apache-airflow-providers-sendgrid==2.0.1
apache-airflow-providers-sftp==2.1.1
apache-airflow-providers-slack==4.0.1
apache-airflow-providers-sqlite==2.0.1
apache-airflow-providers-ssh==2.1.1
### Deployment
Docker-Compose
### Deployment details
We tend to trigger dag runs by some external event, e.g., a media-file upload, see #19745. It is useful to use the media-file path as a dag run id. The media-id can come with some partial path, e.g., `path/to/mediafile`. All this seems to work fine in Airflow, but we can't figure out a way to use such a dag run id in the REST API, as the forward slashes `/` interfere with the API routing.
### What happened
When using the API route `api/v1/dags/{dag_id}/dagRuns/{dag_run_id}` in, e.g., an HTTP GET, we expect a dag run to be found when `dag_run_id` has the value `path/to/mediafile`, but instead a `.status: 404` is returned. When we change the `dag_run_id` to the format `path|to|mediafile`, the dag run is returned.
### What you expected to happen
We would expect a dag run to be returned, even if it contains the character `/`
### How to reproduce
Trigger a dag using a dag_run_id that contains a `/`, then try to retrieve it through the REST API.
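For completeness, a small sketch of the call involved (percent-encoding the slash as `%2F` is shown only to illustrate the attempt — whether the routing accepts it is exactly what this issue is about):
```python
import requests
from urllib.parse import quote

dag_id = "my_dag"                 # example values
dag_run_id = "path/to/mediafile"

# Percent-encode the run id so the slash does not split the URL path.
url = f"http://localhost:8080/api/v1/dags/{dag_id}/dagRuns/{quote(dag_run_id, safe='')}"
response = requests.get(url, auth=("admin", "admin"))
print(response.status_code)  # 404 as reported above
```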
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20063 | https://github.com/apache/airflow/pull/23106 | ebc1f14db3a1b14f2535462e97a6407f48b19f7c | 451c7cbc42a83a180c4362693508ed33dd1d1dab | "2021-12-06T09:00:59Z" | python | "2022-05-03T21:22:12Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,032 | ["airflow/providers/snowflake/hooks/snowflake.py", "tests/providers/snowflake/hooks/test_snowflake.py"] | Snowflake Provider - Hook's support for not providing a region is broken when using SQLAlchemy | ### Apache Airflow Provider(s)
snowflake
### Versions of Apache Airflow Providers
Versions 2.2.x (since https://github.com/apache/airflow/commit/0a37be3e3cf9289f63f1506bc31db409c2b46738).
### Apache Airflow version
2.2.1
### Operating System
Debian GNU/Linux 10 (buster)
### Deployment
Other 3rd-party Helm chart
### Deployment details
Bitnami Airflow Helm chart @ version 8.0.2
### What happened
When connecting to Snowflake via SQLAlchemy using the Snowflake Hook, I get an error that the URL is not valid because my Snowflake instance is in US West 2 (Oregon) which means I don't provide a region explicitly. Snowflake's documentation says:
> If the account is located in the AWS US West (Oregon) region, no additional segments are required and the URL would be xy12345.snowflakecomputing.com
The error is that `xy12345..snowflakecomputing.com` is not a valid URL (note the double-dot caused by the lack of a region).
### What you expected to happen
I expect the connection to be successful.
### How to reproduce
You can use the default snowflake connection if you have one defined and see this problem with the following one-liner:
```shell
python -c 'from airflow.providers.snowflake.hooks.snowflake import SnowflakeHook; SnowflakeHook().get_sqlalchemy_engine().connect()'
```
### Anything else
Fortunately I imagine the fix for this is just to leave the region URL component out when `region` is `None` here: https://github.com/apache/airflow/commit/0a37be3e3cf9289f63f1506bc31db409c2b46738#diff-2b674ac999a5b938fe5045f6475b0c5cc76e4cab89174ac448a9e1d41a5c04d5R215.
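In other words, something along these lines (a simplified sketch, not the provider's actual code):
```python
from typing import Optional

def build_snowflake_host(account: str, region: Optional[str]) -> str:
    # Only add the region segment when one is configured; US West 2 (Oregon) accounts have none.
    if region:
        return f"{account}.{region}.snowflakecomputing.com"
    return f"{account}.snowflakecomputing.com"
```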
Using version `2.1.1` of the Snowflake provider with version `2.2.1` of Airflow is currently a viable workaround so for now I am just avoiding the update to the provider.
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20032 | https://github.com/apache/airflow/pull/20509 | b7086f9815d3856cb4f3ee5bbc78657f19df9d2d | a632b74846bae28408fb4c1b38671fae23ca005c | "2021-12-04T00:26:56Z" | python | "2021-12-28T12:54:54Z" |
closed | apache/airflow | https://github.com/apache/airflow | 20,007 | ["airflow/providers/google/cloud/transfers/postgres_to_gcs.py"] | PostgresToGCSOperator fail on empty table and use_server_side_cursor=True | ### Apache Airflow Provider(s)
google
### Versions of Apache Airflow Providers
apache-airflow-providers-google==6.1.0
### Apache Airflow version
2.2.2 (latest released)
### Operating System
Debian GNU/Linux 10 (buster)
### Deployment
Other Docker-based deployment
### Deployment details
_No response_
### What happened
When I execute `PostgresToGCSOperator` on an empty table and set `use_server_side_cursor=True`, the operator fails with this error:
```
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1332, in _run_raw_task
self._execute_task_with_callbacks(context)
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1458, in _execute_task_with_callbacks
result = self._execute_task(context, self.task)
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1514, in _execute_task
result = execute_callable(context=context)
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/providers/google/cloud/transfers/sql_to_gcs.py", line 154, in execute
files_to_upload = self._write_local_data_files(cursor)
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/providers/google/cloud/transfers/sql_to_gcs.py", line 213, in _write_local_data_files
row = self.convert_types(schema, col_type_dict, row)
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/providers/google/cloud/transfers/sql_to_gcs.py", line 174, in convert_types
return [self.convert_type(value, col_type_dict.get(name)) for name, value in zip(schema, row)]
TypeError: 'NoneType' object is not iterable
```
The operator invocation I'm using:
```python
task_send = PostgresToGCSOperator(
task_id=f'send_{table}',
postgres_conn_id='postgres_raw',
gcp_conn_id=gcp_conn_id,
sql=f'SELECT * FROM public.{table}',
use_server_side_cursor=True,
bucket=bucket,
filename=f'{table}.csv',
export_format='csv',
)
```
### What you expected to happen
I expected that, for an empty table, the operator would not create a file and would not upload anything to Google Cloud.
### How to reproduce
- Create an empty PostgreSQL table.
- Create a DAG with a PostgresToGCSOperator task that uploads this table to Google Cloud.
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/20007 | https://github.com/apache/airflow/pull/21307 | 00f0025abf6500af67f4c5b7543d45658d31b3b2 | 2eb10565b2075d89eb283bd53462c00f5d54ab55 | "2021-12-03T08:17:25Z" | python | "2022-02-15T11:41:33Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,989 | ["airflow/www/static/js/dag_dependencies.js"] | Arrows in DAG dependencies view are not consistent | ### Apache Airflow version
main (development)
### Operating System
N/A
### Versions of Apache Airflow Providers
N/A
### Deployment
Other
### Deployment details
_No response_
### What happened
The arrows in the DAG dependencies view are not consistent with the graph view:
Graph View:

DAG dependencies view:

### What you expected to happen
All arrows should be filled black
### How to reproduce
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19989 | https://github.com/apache/airflow/pull/20303 | 2d9338f2b7fa6b7aadb6a81cd5fc3b3ad8302a4a | 28045696dd3ea7207b1162c2343ba142e1f75e5d | "2021-12-02T18:53:08Z" | python | "2021-12-15T03:50:00Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,986 | ["BREEZE.rst", "PULL_REQUEST_WORKFLOW.rst", "README.md", "TESTING.rst", "breeze-complete", "docs/apache-airflow/concepts/scheduler.rst", "docs/apache-airflow/howto/set-up-database.rst", "docs/apache-airflow/installation/prerequisites.rst", "scripts/ci/libraries/_initialization.sh"] | Postgres 9.6 end of support | ### Describe the issue with documentation
Since November 11, 2021, support for Postgres 9.6 has ended:
https://www.postgresql.org/support/versioning/
### How to solve the problem
Remove Postgres 9.6 support from Airflow.
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19986 | https://github.com/apache/airflow/pull/19987 | 538612c3326b5fd0be4f4114f85e6f3063b5d49c | a299cbf4ce95af49132a6c7b17cd6a0355544836 | "2021-12-02T18:32:02Z" | python | "2021-12-05T23:04:01Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,977 | ["airflow/providers/google/cloud/transfers/gdrive_to_gcs.py", "tests/providers/google/cloud/transfers/test_gdrive_to_gcs.py"] | GoogleDriveToGCSOperator.dry_run() raises AttributeError due to deprecated template_fields | ### Apache Airflow Provider(s)
google
### Versions of Apache Airflow Providers
apache-airflow-providers-google==5.1.0
### Apache Airflow version
2.1.4
### Operating System
MacOS BigSur 11.6
### Deployment
Virtualenv installation
### Deployment details
`tox4` virtual environment
### What happened
1) Unit testing some DAG task operators with `assert op.dry_run() is None`
2) `AttributeError` is raised when `baseoperator` iterates through `template_fields` and calls `getattr`
```python
def dry_run(self) -> None:
"""Performs dry run for the operator - just render template fields."""
self.log.info('Dry run')
for field in self.template_fields:
content = getattr(self, field)
```
3) Caused by the deprecated `destination_bucket` argument, which is assigned to `self.bucket_name` rather than to an attribute of its own
```python
template_fields = [
"bucket_name",
"object_name",
"destination_bucket",
"destination_object",
"folder_id",
"file_name",
"drive_id",
"impersonation_chain",
]
def __init__(
self,
*,
bucket_name: Optional[str] = None,
object_name: Optional[str] = None,
destination_bucket: Optional[str] = None, # deprecated
destination_object: Optional[str] = None, # deprecated
file_name: str,
folder_id: str,
drive_id: Optional[str] = None,
gcp_conn_id: str = "google_cloud_default",
delegate_to: Optional[str] = None,
impersonation_chain: Optional[Union[str, Sequence[str]]] = None,
**kwargs,
) -> None:
super().__init__(**kwargs)
self.bucket_name = destination_bucket or bucket_name
if destination_bucket:
warnings.warn(
"`destination_bucket` is deprecated please use `bucket_name`",
DeprecationWarning,
stacklevel=2,
)
self.object_name = destination_object or object_name
if destination_object:
warnings.warn(
"`destination_object` is deprecated please use `object_name`",
DeprecationWarning,
stacklevel=2,
)
self.folder_id = folder_id
self.drive_id = drive_id
self.file_name = file_name
self.gcp_conn_id = gcp_conn_id
self.delegate_to = delegate_to
self.impersonation_chain = impersonation_chain
```
### What you expected to happen
Calling `op.dry_run()` should return None and not raise any exceptions.
The templated fields contain the deprecated arguments (`destination_bucket`, `destination_object`), which aren't initialized in the class's init method.
The base operator loops through these templated fields, but since `GoogleDriveToGCSOperator` does not initialize `self.destination_bucket` or `self.destination_object`, it raises an `AttributeError`
### How to reproduce
```python
from airflow.providers.google.cloud.transfers import gdrive_to_gcs
# pytest fixtures included as arguments
# won't include for brevity, but can provide if necessary
def test_gdrive_to_gcs_transfer(
test_dag,
mock_gcp_default_conn,
patched_log_entry,
today
):
op = gdrive_to_gcs.GoogleDriveToGCSOperator(
task_id="test_gcs_to_gdrive_transfer",
dag=test_dag,
bucket_name="some-other-bucket",
object_name="thing_i_want_to_copy.csv",
file_name="my_file.csv",
folder_id="my_folder",
drive_id="some_drive_id",
)
assert op.dry_run() is None
```
### Anything else
Not sure where it would be appropriate to address this issue since the deprecated fields support backward compatibility to previous versions of the operator.
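One possible direction, sketched here purely as an illustration (either keep `template_fields` limited to attributes that always exist, or always set the deprecated attributes):
```python
class GoogleDriveToGCSOperatorSketch:  # illustration only, not the real class
    # Option A: drop the deprecated names from template_fields so dry_run() only
    # renders attributes that are actually set on the instance.
    template_fields = [
        "bucket_name",
        "object_name",
        "folder_id",
        "file_name",
        "drive_id",
        "impersonation_chain",
    ]

    def __init__(self, *, bucket_name=None, object_name=None,
                 destination_bucket=None, destination_object=None, **kwargs):
        # Option B: or keep the deprecated names templated, but always set the attributes.
        self.destination_bucket = destination_bucket
        self.destination_object = destination_object
        self.bucket_name = destination_bucket or bucket_name
        self.object_name = destination_object or object_name
```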
This is my first time contributing to the project, but hope this is helpful.
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19977 | https://github.com/apache/airflow/pull/19991 | 53b241534576f1a85fe7f87ed66793d43f3a564e | cb082d361a61da7040e044ff2c1f7758142a9b2d | "2021-12-02T15:14:15Z" | python | "2021-12-02T22:25:33Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,970 | ["dev/breeze/src/airflow_breeze/branch_defaults.py", "dev/breeze/src/airflow_breeze/breeze.py", "dev/breeze/src/airflow_breeze/cache.py", "dev/breeze/src/airflow_breeze/ci/build_image.py", "dev/breeze/src/airflow_breeze/ci/build_params.py", "dev/breeze/src/airflow_breeze/console.py", "dev/breeze/src/airflow_breeze/global_constants.py", "dev/breeze/src/airflow_breeze/utils.py"] | Breeze: Build CI images with Breeze | This quite a task and it should likely be split into smaller PRs until completed.
Breeze has the possibility of building CI and PROD images, but this one focuses only on building the CI image.
Building the image is not very straightforward, because in multiple places the images are built differently:
* for different Python versions
* with different caches
* with different build options (upgrade to newer dependencies for example)
Also, building the image involves multiple steps:
* building base python image (if needed)
* building the CI image
The whole story about images (both PROD and CI) is described here:
https://github.com/apache/airflow/blob/main/IMAGES.rst
This change should end up with a `build-ci-image` command that builds the image with all the possible flags.
NOTE: The old `breeze build image` command also implements complex logic to determine whether images should be pulled or whether they need to be rebuild at all. This is NOT part of this change and it will be implemented later.
Here are the current flags.
```
./breeze build-image --help
Good version of docker 20.10.7.
Build image
Detailed usage for command: build-image
breeze build-image [FLAGS]
Builds docker image (CI or production) without entering the container. You can pass
additional options to this command, such as:
Choosing python version:
'--python'
Choosing cache option:
'--build-cache-local' or '-build-cache-pulled', or '--build-cache-none'
Choosing whether to force pull images or force build the image:
'--force-build-image', '--force-pull-image'
Checking if the base python image has been updated:
'--check-if-base-python-image-updated'
You can also pass '--production-image' flag to build production image rather than CI image.
For GitHub repository, the '--github-repository' can be used to choose repository
to pull/push images.
Flags:
-p, --python PYTHON_MAJOR_MINOR_VERSION
Python version used for the image. This is always major/minor version.
One of:
3.7 3.8 3.9 3.6
-a, --install-airflow-version INSTALL_AIRFLOW_VERSION
Uses different version of Airflow when building PROD image.
2.0.2 2.0.1 2.0.0 wheel sdist
-t, --install-airflow-reference INSTALL_AIRFLOW_REFERENCE
Installs Airflow directly from reference in GitHub when building PROD image.
This can be a GitHub branch like main or v2-2-test, or a tag like 2.2.0rc1.
--installation-method INSTALLATION_METHOD
Method of installing Airflow in PROD image - either from the sources ('.')
or from package 'apache-airflow' to install from PyPI.
Default in Breeze is to install from sources. One of:
. apache-airflow
--upgrade-to-newer-dependencies
Upgrades PIP packages to latest versions available without looking at the constraints.
-I, --production-image
Use production image for entering the environment and builds (not for tests).
-F, --force-build-images
Forces building of the local docker images. The images are rebuilt
automatically for the first time or when changes are detected in
package-related files, but you can force it using this flag.
-P, --force-pull-images
Forces pulling of images from GitHub Container Registry before building to populate cache.
The images are pulled by default only for the first time you run the
environment, later the locally build images are used as cache.
--check-if-base-python-image-updated
Checks if Python base image from DockerHub has been updated vs the current python base
image we store in GitHub Container Registry. Python images are updated regularly with
security fixes, this switch will check if a new one has been released and will pull and
prepare a new base python based on the latest one.
--cleanup-docker-context-files
Removes whl and tar.gz files created in docker-context-files before running the command.
In case there are some files there it unnecessarily increases the context size and
makes the COPY . always invalidated - if you happen to have those files when you build your
image.
Customization options:
-E, --extras EXTRAS
Extras to pass to build images The default are different for CI and production images:
CI image:
devel_ci
Production image:
amazon,async,celery,cncf.kubernetes,dask,docker,elasticsearch,ftp,google,google_auth,
grpc,hashicorp,http,ldap,microsoft.azure,mysql,odbc,pandas,postgres,redis,sendgrid,
sftp,slack,ssh,statsd,virtualenv
--image-tag TAG
Additional tag in the image.
--skip-installing-airflow-providers-from-sources
By default 'pip install' in Airflow 2.0 installs only the provider packages that
are needed by the extras. When you build image during the development (which is
default in Breeze) all providers are installed by default from sources.
You can disable it by adding this flag but then you have to install providers from
wheel packages via --use-packages-from-dist flag.
--disable-pypi-when-building
Disable installing Airflow from pypi when building. If you use this flag and want
to install Airflow, you have to install it from packages placed in
'docker-context-files' and use --install-from-docker-context-files flag.
--additional-extras ADDITIONAL_EXTRAS
Additional extras to pass to build images The default is no additional extras.
--additional-python-deps ADDITIONAL_PYTHON_DEPS
Additional python dependencies to use when building the images.
--dev-apt-command DEV_APT_COMMAND
The basic command executed before dev apt deps are installed.
--additional-dev-apt-command ADDITIONAL_DEV_APT_COMMAND
Additional command executed before dev apt deps are installed.
--additional-dev-apt-deps ADDITIONAL_DEV_APT_DEPS
Additional apt dev dependencies to use when building the images.
--dev-apt-deps DEV_APT_DEPS
The basic apt dev dependencies to use when building the images.
--additional-dev-apt-deps ADDITIONAL_DEV_DEPS
Additional apt dev dependencies to use when building the images.
--additional-dev-apt-envs ADDITIONAL_DEV_APT_ENVS
Additional environment variables set when adding dev dependencies.
--runtime-apt-command RUNTIME_APT_COMMAND
The basic command executed before runtime apt deps are installed.
--additional-runtime-apt-command ADDITIONAL_RUNTIME_APT_COMMAND
Additional command executed before runtime apt deps are installed.
--runtime-apt-deps ADDITIONAL_RUNTIME_APT_DEPS
The basic apt runtime dependencies to use when building the images.
--additional-runtime-apt-deps ADDITIONAL_RUNTIME_DEPS
Additional apt runtime dependencies to use when building the images.
--additional-runtime-apt-envs ADDITIONAL_RUNTIME_APT_DEPS
Additional environment variables set when adding runtime dependencies.
Build options:
--disable-mysql-client-installation
Disables installation of the mysql client which might be problematic if you are building
image in controlled environment. Only valid for production image.
--disable-mssql-client-installation
Disables installation of the mssql client which might be problematic if you are building
image in controlled environment. Only valid for production image.
--constraints-location
Url to the constraints file. In case of the production image it can also be a path to the
constraint file placed in 'docker-context-files' folder, in which case it has to be
in the form of '/docker-context-files/<NAME_OF_THE_FILE>'
--disable-pip-cache
Disables GitHub PIP cache during the build. Useful if GitHub is not reachable during build.
--install-from-docker-context-files
This flag is used during image building. If it is used additionally to installing
Airflow from PyPI, the packages are installed from the .whl and .tar.gz packages placed
in the 'docker-context-files' folder. The same flag can be used during entering the image in
the CI image - in this case also the .whl and .tar.gz files will be installed automatically
-C, --force-clean-images
Force build images with cache disabled. This will remove the pulled or build images
and start building images from scratch. This might take a long time.
-r, --skip-rebuild-check
Skips checking image for rebuilds. It will use whatever image is available locally/pulled.
-L, --build-cache-local
Uses local cache to build images. No pulled images will be used, but results of local
builds in the Docker cache are used instead. This will take longer than when the pulled
cache is used for the first time, but subsequent '--build-cache-local' builds will be
faster as they will use mostly the locally build cache.
This is default strategy used by the Production image builds.
-U, --build-cache-pulled
Uses images pulled from GitHub Container Registry to build images.
Those builds are usually faster than when ''--build-cache-local'' with the exception if
the registry images are not yet updated. The images are updated after successful merges
to main.
This is default strategy used by the CI image builds.
-X, --build-cache-disabled
Disables cache during docker builds. This is useful if you want to make sure you want to
rebuild everything from scratch.
This strategy is used by default for both Production and CI images for the scheduled
(nightly) builds in CI.
-g, --github-repository GITHUB_REPOSITORY
GitHub repository used to pull, push images.
Default: apache/airflow.
-v, --verbose
Show verbose information about executed docker, kind, kubectl, helm commands. Useful for
debugging - when you run breeze with --verbose flags you will be able to see the commands
executed under the hood and copy&paste them to your terminal to debug them more easily.
Note that you can further increase verbosity and see all the commands executed by breeze
by running 'export VERBOSE_COMMANDS="true"' before running breeze.
--dry-run-docker
Only show docker commands to execute instead of actually executing them. The docker
commands are printed in yellow color.
```
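For reference, a typical invocation combining the flags above looks something like `./breeze build-image --python 3.8 --build-cache-pulled` (or `--build-cache-local` / `--build-cache-disabled` for the other cache strategies), and the new `build-ci-image` command should cover at least that workflow.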
| https://github.com/apache/airflow/issues/19970 | https://github.com/apache/airflow/pull/20338 | 919ff4567d86a09fb069dcfd84885b496229eea9 | 95740a87083c703968ce3da45b15113851ef09f7 | "2021-12-02T13:31:21Z" | python | "2022-01-05T17:38:05Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,966 | [".pre-commit-config.yaml", "dev/breeze/doc/BREEZE.md"] | Breeze: Make Breeze works for Windows seemleasly | The goal to achieve:
We would like to have `./Breeze2.exe` in the main directory of Airflow which should do the same as `python Breeze2` now - firing the Breeze command line and managing the virtualenv for it.
We need to have:
* colors in terminal
* possibility to use commands and flags
* [stretch goal] autocomplete in Windows (we might separate it out to a separate task later)
| https://github.com/apache/airflow/issues/19966 | https://github.com/apache/airflow/pull/20148 | 97261c642cbf07db91d252cf6b0b7ff184cd64c6 | 9db894a88e04a71712727ef36250a29b2e34f4fe | "2021-12-02T13:15:37Z" | python | "2022-01-03T16:44:38Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,955 | ["airflow/www/views.py"] | Add hour and minute to time format on x-axis in Landing Times | ### Description
It would be great if the date and time format of the x-axis in Landing Times were `%d %b %Y, %H:%M` instead of `%d %b %Y`, so that the `execution_date` of task instances is visible.
### Use case/motivation
For the x-axis of all line charts using `nvd3.lineChart`, the format shows only the date without the time, because it defaults to `%d %b %Y` when `x_is_date` is `True`. From what I have seen, in version 1.7 the format was `%d %b %Y, %H:%M`; it changed when Highcharts stopped being used in version 1.8. It is simple to fix by passing `x_axis_format="%d %b %Y, %H:%M"` wherever the `nvd3.lineChart` class is initialized.
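A rough sketch of what the change could look like (assuming the chart construction in `airflow/www/views.py` otherwise stays the same; the sizing kwargs here are illustrative):
```python
import nvd3

# Pass an explicit x_axis_format so the x-axis shows the time as well as the date.
chart = nvd3.lineChart(
    name="lineChart",
    x_is_date=True,
    x_axis_format="%d %b %Y, %H:%M",  # instead of the "%d %b %Y" default
    height=600,
    width="1200",
)
```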
### Related issues
_No response_
### Are you willing to submit a PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19955 | https://github.com/apache/airflow/pull/20002 | 2c80aaab4f486688fa4b8e252e1147a5dfabee54 | 6a77e849be3a505bf2636e8224862d77a4719621 | "2021-12-02T02:32:25Z" | python | "2021-12-16T15:32:37Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,922 | ["airflow/operators/python.py", "tests/operators/test_python.py"] | ShortCircuitOperator does not save to xcom | ### Description
`ShortCircuitOperator` does not return any value regardless of the result.
Still, if `condition` evaluates to falsey, it can be useful to store/save the result of the condition to XCom so that downstream tasks can use it.
Many objects evaluates to True/False, not just booleans - see https://docs.python.org/3/library/stdtypes.html
### Use case/motivation
The change is trivial and would allow combining a Python task and a ShortCircuit into one:
1. if the callable returns None (or False) -> skip
2. if the callable returns a non-empty object (or True) -> continue. The proposed change is to pass the condition on to XCom.
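A short sketch of how this could look from the DAG author's side, assuming the operator starts pushing the returned condition to XCom:
```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import ShortCircuitOperator

def check_new_rows():
    rows = ["row-1", "row-2"]  # imagine this comes from a query
    return rows                # truthy -> downstream runs; empty list -> skip downstream

with DAG("short_circuit_xcom_example", start_date=datetime(2021, 12, 1), schedule_interval=None) as dag:
    check = ShortCircuitOperator(task_id="check_new_rows", python_callable=check_new_rows)
    # With the proposed change, a downstream task could pull the value via
    # ti.xcom_pull(task_ids="check_new_rows")
```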
### Related issues
_No response_
### Are you willing to submit a PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19922 | https://github.com/apache/airflow/pull/20071 | 993ed933e95970d14e0b0b5659ad28f15a0e5fde | 4f964501e5a6d5685c9fa78a6272671a79b36dd1 | "2021-12-01T05:03:27Z" | python | "2021-12-11T16:27:52Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,917 | ["airflow/models/dag.py", "airflow/models/dagrun.py", "tests/models/test_dag.py", "tests/models/test_dagrun.py", "tests/models/test_taskinstance.py"] | Task-level end_date stops entire DAG | ### Apache Airflow version
2.2.2 (latest released)
### Operating System
Ubuntu 19.10
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other Docker-based deployment
### Deployment details
_No response_
### What happened
If a task has an `end_date`, the whole DAG stops at that time instead of only that task.
### What you expected to happen
The task with the `end_date` should stop at that time and the rest should continue.
### How to reproduce
```
from datetime import datetime, timedelta
from airflow.models import DAG
from airflow.operators.dummy import DummyOperator
default_args = {
"owner": "me",
"email": [""],
"start_date": datetime(2021, 11, 20),
}
dag = DAG(
"end_date_test",
default_args=default_args,
schedule_interval="@daily",
)
task1 = DummyOperator(
task_id="no_end_date",
dag=dag
)
task2 = DummyOperator(
task_id="special_start_end",
start_date=datetime(2021, 11, 22),
end_date=datetime(2021, 11, 24),
dag=dag,
)
```

### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19917 | https://github.com/apache/airflow/pull/20920 | f612a2f56add4751e959625c49368d09a2a47d55 | 85871eba420f3324432f55f74fe57005ff47a21c | "2021-12-01T02:44:43Z" | python | "2022-03-27T19:11:38Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,903 | ["airflow/decorators/task_group.py", "tests/utils/test_task_group.py"] | Error setting dependencies on task_group defined using the decorator | ### Apache Airflow version
2.2.2 (latest released)
### Operating System
MacOS 11.6.1
### Versions of Apache Airflow Providers
$ pip freeze | grep airflow
apache-airflow==2.2.2
apache-airflow-providers-celery==2.1.0
apache-airflow-providers-ftp==2.0.1
apache-airflow-providers-http==2.0.1
apache-airflow-providers-imap==2.0.1
apache-airflow-providers-sqlite==2.0.1
### Deployment
Other
### Deployment details
`airflow standalone`
### What happened
```
AttributeError: 'NoneType' object has no attribute 'update_relative'
```
### What you expected to happen
Task group should be set as downstream of `start` task, and upstream of `end` task
### How to reproduce
* Add the following code to dags folder
```python
from datetime import datetime
from airflow.decorators import dag, task, task_group
@dag(start_date=datetime(2023, 1, 1), schedule_interval="@once")
def test_dag_1():
@task
def start():
pass
@task
def do_thing(x):
print(x)
@task_group
def do_all_things():
do_thing(1)
do_thing(2)
@task
def end():
pass
start() >> do_all_things() >> end()
test_dag_1()
```
* Run `airflow standalone`
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19903 | https://github.com/apache/airflow/pull/20671 | f2039b4c9e15b514661d4facbd710791fe0a2ef4 | 384fa4a87dfaa79a89ad8e18ac1980e07badec4b | "2021-11-30T19:55:25Z" | python | "2022-01-08T04:09:03Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,902 | ["airflow/decorators/base.py", "tests/utils/test_task_group.py"] | DuplicateTaskIdFound when reusing tasks with task_group decorator | ### Apache Airflow version
2.2.2 (latest released)
### Operating System
MacOS 11.6.1
### Versions of Apache Airflow Providers
$ pip freeze | grep airflow
apache-airflow==2.2.2
apache-airflow-providers-celery==2.1.0
apache-airflow-providers-ftp==2.0.1
apache-airflow-providers-http==2.0.1
apache-airflow-providers-imap==2.0.1
apache-airflow-providers-sqlite==2.0.1
### Deployment
Other
### Deployment details
`airflow standalone`
### What happened
Exception raised:
```
raise DuplicateTaskIdFound(f"Task id '{key}' has already been added to the DAG")
airflow.exceptions.DuplicateTaskIdFound: Task id 'do_all_things.do_thing__1' has already been added to the DAG
```
### What you expected to happen
_No response_
### How to reproduce
* Add the following python file to the `dags` folder:
```python
from datetime import datetime
from airflow.decorators import dag, task, task_group
@dag(start_date=datetime(2023, 1, 1), schedule_interval="@once")
def test_dag_1():
@task
def start():
pass
@task
def do_thing(x):
print(x)
@task_group
def do_all_things():
for i in range(5):
do_thing(i)
@task
def end():
pass
start() >> do_all_things() >> end()
test_dag_1()
```
* Start airflow by running `airflow standalone`
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19902 | https://github.com/apache/airflow/pull/20870 | 077bacd9e7cc066365d3f201be2a5d9b108350fb | f881e1887c9126408098919ecad61f94e7a9661c | "2021-11-30T19:44:46Z" | python | "2022-01-18T09:37:35Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,901 | ["airflow/models/dagrun.py", "tests/jobs/test_scheduler_job.py"] | No scheduling when max_active_runs is 1 | ### Apache Airflow version
2.2.2 (latest released)
### Operating System
Debian
### Versions of Apache Airflow Providers
_No response_
### Deployment
Official Apache Airflow Helm Chart
### Deployment details
_No response_
### What happened
Since version 2.2, the field `DAG.next_dagrun_create_after` is not calculated when `DAG.max_active_runs` is 1.
### What you expected to happen
https://github.com/apache/airflow/blob/fca2b19a5e0c081ab323479e76551d66ab478d07/airflow/models/dag.py#L2466
If this condition is evaluated while the run's state is "running", then it is incorrect.
### How to reproduce
Create a DAG with a `schedule_interval` and `max_active_runs=1`.
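For example, a minimal sketch of such a DAG:
```python
from datetime import datetime
from airflow import DAG
from airflow.operators.dummy import DummyOperator

with DAG(
    dag_id="max_active_runs_one",
    start_date=datetime(2021, 11, 1),
    schedule_interval="@hourly",
    max_active_runs=1,
    catchup=True,
) as dag:
    DummyOperator(task_id="noop")
```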
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19901 | https://github.com/apache/airflow/pull/21413 | 5fbf2471ab4746f5bc691ff47a7895698440d448 | feea143af9b1db3b1f8cd8d29677f0b2b2ab757a | "2021-11-30T18:30:43Z" | python | "2022-02-24T07:12:12Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,891 | [".github/workflows/ci.yml", "airflow/api/common/experimental/get_dag_runs.py", "airflow/jobs/base_job.py", "airflow/models/base.py", "airflow/models/dagrun.py", "airflow/providers/microsoft/winrm/hooks/winrm.py", "airflow/providers/ssh/hooks/ssh.py", "tests/providers/amazon/aws/hooks/test_eks.py", "tests/providers/google/cloud/hooks/test_gcs.py", "tests/www/api/experimental/test_dag_runs_endpoint.py"] | Re-enable MyPy | # Why Mypy re-enable
For a few weeks MyPy checks have been disabled after the switch to Python 3.7 (per https://github.com/apache/airflow/pull/19317).
We should, however, re-enable it, as it is very useful in catching a number of mistakes.
# How does it work
We've re-added the mypy pre-commit now - with mypy bumped to 0.910. This version detects far more errors, and we should fix them all before we switch the CI check back.
* mypy will be running for incremental changes in pre-commit, same as before. This will enable incremental fixes of the code changed by committers who use pre-commits locally
* mypy on CI runs in non-failing mode. When the main pre-commit check is run, mypy is disabled, but then it is run as a separate step (which does not fail but will show the result of running mypy on all our code). This will enable us to track the progress of fixes
# Can I help with the effort, you ask?
We started concerted effort now and incrementally fix all the mypy incompatibilities - ideally package/by/package to avoid huge code reviews. We'd really appreciate a number of people to contribute, so that we can re-enable mypy back fully and quickly :).
# How can I help?
What you need is:
* checkout `main`
* `./breeze build-image`
* `pip install pre-commit`
* `pre-commit install`
This will enable automated checks for when you do a regular contribution. When you make your change, any MyPy issues will be reported and you need to fix them all to commit. You can also commit with the `--no-verify` flag to skip that, but, well, if you can improve Airflow a little - why not?
# How can I help more ?
You can add PRs that are fixing whole packages, without contributing features or bugfixes. Please refer to this issue #19891 and ideally comment below in the issue that you want to take care of a package (to avoid duplicate work).
An easy way to run the MyPy check for a package is either from the host:
```
find DIRECTORY -name "*.py" | xargs pre-commit run mypy --files
```
or from ./breeze shell:
```
mypy --namespace-packages DIRECTORY
```
# Current list of mypy PRs:
https://github.com/apache/airflow/pulls?q=is%3Aopen+is%3Apr+label%3Amypy
# Remaining packages
Here is the list of remaining packages to be "mypy compliant" generated with:
```
pre-commit run mypy --all-files 2>&1 | sed -r "s/\x1B\[([0-9]{1,2}(;[0-9]{1,2})?)?[mGK]//g" | grep "error:" | sort | awk 'FS=":" { print $1 }' | xargs dirname | sort | uniq -c | xargs -n 2 printf "* [ ] (%4d) %s\n"
```
* [ ] ( 1) airflow/api/common/experimental
* [ ] ( 1) airflow/contrib/sensors
* [ ] ( 1) airflow/example_dags
* [ ] ( 1) airflow/jobs
* [ ] ( 4) airflow/models
* [ ] ( 1) airflow/providers/microsoft/winrm/hooks
* [ ] ( 1) airflow/providers/ssh/hooks
* [ ] ( 1) tests/providers/amazon/aws/hooks
* [ ] ( 1) tests/providers/google/cloud/hooks
### Committer
- [X] I acknowledge that I am a maintainer/committer of the Apache Airflow project. | https://github.com/apache/airflow/issues/19891 | https://github.com/apache/airflow/pull/21020 | 07ea9fcaa10fc1a8c43ef5f627360d4adb12115a | 9ed9b5170c8dbb11469a88c41e323d8b61a1e7e6 | "2021-11-30T12:07:37Z" | python | "2022-01-24T21:39:08Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,877 | ["airflow/providers/amazon/aws/hooks/emr.py", "tests/providers/amazon/aws/hooks/test_emr_containers.py"] | Refactor poll_query_status in EMRContainerHook | ### Body
The goal is to refactor the code so that we can remove this TODO
https://github.com/apache/airflow/blob/7640ba4e8ee239d6e2bbf950d53d624b9df93059/airflow/providers/amazon/aws/hooks/emr_containers.py#L174-L176
More information about the concerns can be found on https://github.com/apache/airflow/pull/16766#discussion_r668089559
### Committer
- [X] I acknowledge that I am a maintainer/committer of the Apache Airflow project. | https://github.com/apache/airflow/issues/19877 | https://github.com/apache/airflow/pull/21423 | 064b39f3faae26e5b1312510142b50765e58638b | c8d49f63ca60fa0fb447768546c2503b746a66dd | "2021-11-29T13:43:21Z" | python | "2022-03-08T12:59:34Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,850 | ["airflow/providers/amazon/aws/hooks/batch_client.py", "airflow/providers/amazon/aws/sensors/batch.py", "airflow/providers/amazon/provider.yaml", "tests/providers/amazon/aws/sensors/test_batch.py"] | AWS Batch Job Sensor | ### Description
Add a sensor for AWS Batch jobs that will poll the job until it reaches a terminal state.
### Use case/motivation
This feature will enable DAGs to track the status of Batch jobs that are submitted by an upstream task that doesn't use the `BatchOperator`. An example use case: the Batch job is submitted by an upstream `PythonOperator`, with its own functional logic, that returns the `job_id` of the submitted job.
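A sketch of what the usage could look like (the sensor class and its arguments here are purely illustrative, since that sensor is exactly what this issue is requesting):
```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator
# Illustrative import path/class name for the requested sensor:
from airflow.providers.amazon.aws.sensors.batch import BatchSensor

def submit_batch_job(**_):
    # ... call boto3's batch.submit_job(...) here ...
    return "my-batch-job-id"

with DAG("batch_sensor_example", start_date=datetime(2021, 11, 1), schedule_interval=None) as dag:
    submit = PythonOperator(task_id="submit_job", python_callable=submit_batch_job)
    wait = BatchSensor(
        task_id="wait_for_job",
        job_id="{{ ti.xcom_pull(task_ids='submit_job') }}",  # job_id returned by the upstream task
    )
    submit >> wait
```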
### Related issues
_No response_
### Are you willing to submit a PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19850 | https://github.com/apache/airflow/pull/19885 | 7627de383e5cdef91ca0871d8107be4e5f163882 | af28b4190316401c9dfec6108d22b0525974eadb | "2021-11-27T05:11:15Z" | python | "2021-12-05T21:52:30Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,801 | ["airflow/sensors/base.py", "tests/sensors/test_base.py"] | Airflow scheduler crashed with TypeError: '>=' not supported between instances of 'datetime.datetime' and 'NoneType' | ### Apache Airflow version
2.1.4
### Operating System
Ubuntu 20.04.3 LTS
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other Docker-based deployment
### Deployment details
_No response_
### What happened
Airflow scheduler crashed with the following exception:
```
[2021-11-23 22:41:16,528] {scheduler_job.py:662} INFO - Starting the scheduler
[2021-11-23 22:41:16,528] {scheduler_job.py:667} INFO - Processing each file at most -1 times
[2021-11-23 22:41:16,639] {manager.py:254} INFO - Launched DagFileProcessorManager with pid: 19
[2021-11-23 22:41:16,641] {scheduler_job.py:1217} INFO - Resetting orphaned tasks for active dag runs
[2021-11-23 22:41:16,644] {settings.py:51} INFO - Configured default timezone Timezone('Etc/GMT-7')
[2021-11-23 22:41:19,016] {scheduler_job.py:711} ERROR - Exception when executing SchedulerJob._run_scheduler_loop
Traceback (most recent call last):
File "/usr/local/lib/python3.8/dist-packages/airflow/jobs/scheduler_job.py", line 695, in _execute
self._run_scheduler_loop()
File "/usr/local/lib/python3.8/dist-packages/airflow/jobs/scheduler_job.py", line 788, in _run_scheduler_loop
num_queued_tis = self._do_scheduling(session)
File "/usr/local/lib/python3.8/dist-packages/airflow/jobs/scheduler_job.py", line 901, in _do_scheduling
callback_to_run = self._schedule_dag_run(dag_run, session)
File "/usr/local/lib/python3.8/dist-packages/airflow/jobs/scheduler_job.py", line 1143, in _schedule_dag_run
schedulable_tis, callback_to_run = dag_run.update_state(session=session, execute_callbacks=False)
File "/usr/local/lib/python3.8/dist-packages/airflow/utils/session.py", line 67, in wrapper
return func(*args, **kwargs)
File "/usr/local/lib/python3.8/dist-packages/airflow/models/dagrun.py", line 438, in update_state
info = self.task_instance_scheduling_decisions(session)
File "/usr/local/lib/python3.8/dist-packages/airflow/utils/session.py", line 67, in wrapper
return func(*args, **kwargs)
File "/usr/local/lib/python3.8/dist-packages/airflow/models/dagrun.py", line 539, in task_instance_scheduling_decisions
schedulable_tis, changed_tis = self._get_ready_tis(scheduleable_tasks, finished_tasks, session)
File "/usr/local/lib/python3.8/dist-packages/airflow/models/dagrun.py", line 565, in _get_ready_tis
if st.are_dependencies_met(
File "/usr/local/lib/python3.8/dist-packages/airflow/utils/session.py", line 67, in wrapper
return func(*args, **kwargs)
File "/usr/local/lib/python3.8/dist-packages/airflow/models/taskinstance.py", line 890, in are_dependencies_met
for dep_status in self.get_failed_dep_statuses(dep_context=dep_context, session=session):
File "/usr/local/lib/python3.8/dist-packages/airflow/models/taskinstance.py", line 911, in get_failed_dep_statuses
for dep_status in dep.get_dep_statuses(self, session, dep_context):
File "/usr/local/lib/python3.8/dist-packages/airflow/ti_deps/deps/base_ti_dep.py", line 101, in get_dep_statuses
yield from self._get_dep_statuses(ti, session, dep_context)
File "/usr/local/lib/python3.8/dist-packages/airflow/ti_deps/deps/ready_to_reschedule.py", line 66, in _get_dep_statuses
if now >= next_reschedule_date:
TypeError: '>=' not supported between instances of 'datetime.datetime' and 'NoneType'
[2021-11-23 22:41:20,020] {process_utils.py:100} INFO - Sending Signals.SIGTERM to GPID 19
```
### What you expected to happen
_No response_
### How to reproduce
Define a `BaseSensorOperator` task with a large `poke_interval` and `reschedule` mode:
```
BaseSensorOperator(
task_id='task',
poke_interval=863998946,
mode='reschedule',
dag=dag
)
```
### Anything else
_No response_
### Are you willing to submit PR?
- [x] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19801 | https://github.com/apache/airflow/pull/19821 | 9c05a951175c231478cbc19effb0e2a4cccd7a3b | 2213635178ca9d0ae96f5f68c88da48f7f104bf1 | "2021-11-24T03:30:05Z" | python | "2021-12-13T09:38:35Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,785 | ["airflow/utils/task_group.py", "tests/utils/test_task_group.py"] | Applying labels to task groups shows a cycle in the graph view for the dag | ### Apache Airflow version
2.2.2
### Operating System
Docker (debian:buster)
### Versions of Apache Airflow Providers
N/A
### Deployment
Astronomer
### Deployment details
Run Airflow with this DAG:
```python3
from airflow.models import DAG
from airflow.models.baseoperator import chain
from airflow.operators.dummy import DummyOperator
from airflow.utils.edgemodifier import Label
from airflow.utils.task_group import TaskGroup

with DAG(
dag_id="label_bug_without_chain"
) as dag:
with TaskGroup(group_id="group1") as taskgroup1:
t1 = DummyOperator(task_id="dummy1")
t2 = DummyOperator(task_id="dummy2")
t3 = DummyOperator(task_id="dummy3")
t4 = DummyOperator(task_id="dummy4")
chain([Label("branch three")], taskgroup1, t4,)
```
### What happened
expanded task views look like they have cycles
<img width="896" alt="Screen Shot 2021-11-22 at 2 33 49 PM" src="https://user-images.githubusercontent.com/17841735/143083099-d250fd7e-963f-4b34-b544-405b51ee2859.png">
### What you expected to happen
The task group shouldn't display as if it has loops in it.
### How to reproduce
View the dag shown in the deployment details.
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19785 | https://github.com/apache/airflow/pull/24847 | 96b01a8012d164df7c24c460149d3b79ecad3901 | efc05a5f0b3d261293c2efaf6771e4af9a2f324c | "2021-11-23T18:39:01Z" | python | "2022-07-05T15:40:00Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,757 | ["airflow/config_templates/config.yml", "airflow/config_templates/default_airflow.cfg"] | Change the default for `dag_processor_manager_log_location` | ### Description
Should the default for the [dag_processor_manager_log_location](https://airflow.apache.org/docs/apache-airflow/stable/configurations-ref.html#dag-processor-manager-log-location) be `{BASE_LOG_FOLDER}/dag_processor_manager/dag_processor_manager.log` instead of `{AIRFLOW_HOME}/logs/dag_processor_manager/dag_processor_manager.log`?
### Use case/motivation
I'm running the k8s executor and we are changing our security profile on the pods such that the filesystem is read-only except for `/tmp`. I started out by changing `base_log_folder`, and I spent a while trying to debug parts of my logging config that were still trying to write to `{AIRFLOW_HOME}/logs`.
I found that the processor config was the issue because the default location was `{AIRFLOW_HOME}/logs/dag_processor_manager/dag_processor_manager.log` ([here](https://airflow.apache.org/docs/apache-airflow/stable/configurations-ref.html#dag-processor-manager-log-location))
Maybe it is fine as is, but I found it hard to debug.
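In the meantime, explicitly setting `dag_processor_manager_log_location = /tmp/airflow/logs/dag_processor_manager/dag_processor_manager.log` (alongside a matching `base_log_folder`) in the `[logging]` section should keep everything on the writable volume and work around it - but deriving the default from `{BASE_LOG_FOLDER}` would have saved the debugging time.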
### Related issues
_No response_
### Are you willing to submit a PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19757 | https://github.com/apache/airflow/pull/19793 | 6c80149d0abf84caec8f4c1b4e8795ea5923f89a | 00fd3af52879100d8dbca95fd697d38fdd39e60a | "2021-11-22T21:42:23Z" | python | "2021-11-24T18:40:13Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,754 | ["airflow/sensors/external_task.py", "newsfragments/23647.bugfix.rst", "tests/sensors/test_external_task_sensor.py"] | ExternalTaskSensor should skip if soft_fail=True and external task in one of the failed_states | ### Apache Airflow version
2.1.4
### Operating System
Debian GNU/Linux 10 (buster)
### Versions of Apache Airflow Providers
_No response_
### Deployment
Astronomer
### Deployment details
_No response_
### What happened
I ran into a scenario where I use an `ExternalTaskSensor` with `soft_fail=True` but also with `failed_states=['skipped']`. I would expect that, if the external task skipped, this sensor would be marked as skipped; however, the `failed_states` check in the poke method explicitly fails with an `AirflowException` when the external task is in one of those states.
Wouldn't it make more sense to skip because of the `soft_fail`?
### What you expected to happen
The `ExternalTaskSensor` task should skip
### How to reproduce
1. Add a DAG with a task that is set to skip, such as this `BashOperator` task set to skip taken from https://airflow.apache.org/docs/apache-airflow/stable/howto/operator/bash.html#skipping:
```
this_will_skip = BashOperator(
task_id='this_will_skip',
bash_command='echo "hello world"; exit 99;',
dag=dag,
)
```
2. Add a second DAG with an `ExternalTaskSensor`
3. Set that sensor to have `external_dag_id` be the other DAG, `external_task_id` be the skipped task in that other DAG, `failed_states=['skipped']`, and `soft_fail=True` (see the sketch after this list)
4. The `ExternalTaskSensor` fails instead of skips
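A sketch of the sensor from step 3 (the task id matches the example above; `other_dag` is a placeholder for the DAG from step 1):
```python
from airflow.sensors.external_task import ExternalTaskSensor

wait_for_skip = ExternalTaskSensor(
    task_id="wait_for_this_will_skip",
    external_dag_id="other_dag",          # the DAG from step 1
    external_task_id="this_will_skip",
    failed_states=["skipped"],
    soft_fail=True,
)
```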
### Anything else
I don't know what is desirable for most Airflow users:
1. To have `soft_fail` only cause skips if the sensor times out? (like it seems to do currently)
2. To have `ExternalTaskSensor` with `soft_fail` to skip any time it would otherwise fail, such as the external task being in one of the `failed_states`?
3. To have some other way for the `ExternalTaskSensor` to skip if the external task skipped?
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19754 | https://github.com/apache/airflow/pull/23647 | 7de050ceeb381fb7959b65acd7008e85b430c46f | 1b345981f6e8e910b3542ec53829e39e6c9b6dba | "2021-11-22T19:06:38Z" | python | "2022-06-24T13:50:13Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,716 | ["airflow/macros/__init__.py", "airflow/models/taskinstance.py", "airflow/operators/python.py", "airflow/utils/context.py", "airflow/utils/context.pyi", "airflow/utils/operator_helpers.py", "tests/models/test_taskinstance.py", "tests/operators/test_python.py", "tests/providers/amazon/aws/sensors/test_s3_key.py", "tests/providers/papermill/operators/test_papermill.py"] | [Airflow 2.2.2] execution_date Proxy object - str formatting error | ### Apache Airflow version
2.2.2 (latest released)
### Operating System
Ubuntu 18.04.6
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other Docker-based deployment
### Deployment details
_No response_
### What happened
The deprecated variable `execution_date` raises an error when used in an f-string with date string formatting.
```python
In [1]: execution_date
DeprecationWarning: Accessing 'execution_date' from the template is deprecated and will be removed in a future version. Please use 'logical_date' or 'data_interval_start' instead.
Out[1]: <Proxy at 0x7fb6f9af81c0 wrapping DateTime(2021, 11, 18, 0, 0, 0, tzinfo=Timezone('UTC')) at 0x7fb6f9aeff90 with factory <function TaskInstance.get_template_context.<locals>.deprecated_proxy.<locals>.deprecated_func at 0x7fb6f98699d0>>
In [2]: f"{execution_date:%Y-%m-%d}"
---------------------------------------------------------------------------
TypeError Traceback (most recent call last)
----> 1 f"{execution_date:%Y-%m-%d}"
TypeError: unsupported format string passed to Proxy.__format__
```
### What you expected to happen
Executing `f"{execution_date:%Y-%m-%d}"` should return a string and not raise an error.
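In the meantime, switching the callable's argument to `logical_date` (or `data_interval_start`) seems to avoid the error, since those context values are plain `DateTime` objects rather than deprecation proxies - but renaming arguments shouldn't be necessary just to keep string formatting working.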
### How to reproduce
```python
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator
def test_str_fmt(execution_date: datetime):
return f"{execution_date:%Y-%m-%d}"
dag = DAG(
dag_id="Test_Date_String",
schedule_interval="@daily",
catchup=False,
default_args={
"depends_on_past": False,
"start_date": datetime(2021, 11, 1),
"email": None,
"email_on_failure": False,
"email_on_retry": False,
"retries": 0,
},
)
with dag:
test_task = PythonOperator(
task_id="test_task",
python_callable=test_str_fmt,
)
```
### Anything else
```python
from datetime import datetime
...
datetime.fromisoformat(next_ds)
TypeError: fromisoformat: argument must be str
```
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19716 | https://github.com/apache/airflow/pull/19886 | f6dca1fa5e70ef08798adeb5a6bfc70f41229646 | caaf6dcd3893bbf11db190f9969af9aacc773255 | "2021-11-19T20:13:25Z" | python | "2021-12-01T07:14:56Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,699 | ["airflow/config_templates/config.yml", "airflow/config_templates/default_airflow.cfg", "airflow/executors/celery_executor.py", "tests/executors/test_celery_executor.py"] | task_instances stuck in "queued" and are missing corresponding celery_taskmeta entries | ### Apache Airflow version
2.2.2 (latest released)
### Operating System
Linux Mint 20.2
### Versions of Apache Airflow Providers
apache-airflow-providers-celery = "2.1.0"
apache-airflow-providers-papermill = "^2.1.0"
apache-airflow-providers-postgres = "^2.2.0"
apache-airflow-providers-google = "^6.1.0"
### Deployment
Docker-Compose
### Deployment details
Docker-compose deploys into our GCP k8s cluster
### What happened
Hi,
we're running Airflow for our ETL pipelines.
Our DAGs run in parallel and we spawn a fair bit of parallel DAGs and tasks every morning for our pipelines.
We run our Airflow in a k8s cluster in GCP and we use Celery for our executors.
And we use autopilot to dynamically scale up and down the cluster as the workload increases or decreases, thereby sometimes tearing down airflow workers.
Ever since upgrading to Airflow 2.0 we've had a lot of problems with tasks getting stuck in "queued" or "running", and we've had to clean it up by manually failing the stuck tasks and re-running the DAGs.
Following the discussions here over the last months, it looks like we've not been alone :-)
But, after upgrading to Airflow 2.2.1 we saw a significant decrease in the number of tasks getting stuck (yay!), something we hoped for given the bug fixes for the scheduler in that release.
However, we still have a few tasks getting stuck (Stuck = "Task in queued") on most mornings that require the same manual intervention.
I've started digging in the Airflow DB trying to see where there's a discrepancy, and every time a task gets stuck it's missing a corresponding entry in the "celery_taskmeta" table.
This is a consistent pattern for the tasks that are stuck with us at this point. The task has rows in the tables "task_instance", "job", and "dag_run" with IDs referencing each other.
But the "external_executor_id" in "task_instance" is missing a corresponding entry in the "celery_taskmeta" table. So nothing ever gets executed and the task_instance is forever stuck in "queued" and never cleaned up by the scheduler.
I can see in "dag_run::last_scheduling_decision" that the scheduler is continuously re-evaluating this task as the timestamp is updated, so it's inspecting it at least, but it leaves everything in the "queued" state.
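For reference, a rough sketch of the kind of query that surfaces these orphaned rows (this assumes, as in our setup, that the Celery result backend tables live in the same database as the Airflow metadata):
```python
from airflow.settings import Session

session = Session()
# Queued task instances whose Celery task id has no matching result-backend row.
orphaned = session.execute(
    """
    SELECT ti.dag_id, ti.task_id, ti.external_executor_id
    FROM task_instance ti
    LEFT JOIN celery_taskmeta ctm ON ctm.task_id = ti.external_executor_id
    WHERE ti.state = 'queued' AND ctm.task_id IS NULL
    """
).fetchall()
print(orphaned)
```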
The other day I bumped our Airflow to 2.2.2, but we still get the same behavior.
And finally, whenever we get tasks that are stuck in "Queued" in this way, they usually occur within the same few seconds timestamp-wise, and it correlates with a timestamp when autopilot scaled down the number of airflow workers.
If the tasks end up in this orphaned/queued state then they never get executed and are stuck until we fail them. Longest I've seen so far is a few days in this state until the task was discovered.
Restarting the scheduler does not resolve this issue and tasks are still stuck in "queued" afterwards.
Would it be possible (and a good idea?) to include in the scheduler a check whether a "task_instance" row has a corresponding row in "celery_taskmeta", and if it's still missing from "celery_taskmeta" after a given amount of time, clean it up?
After reading about and watching Ash Berlin-Taylor's most excellent video with a deep dive into the Airflow scheduler, this does seem like exactly the check that we should add to the scheduler.
Also if there's any data I can dig out and provide for this, don't hesitate to let me know.
### What you expected to happen
I expect orphaned tasks in the state queued and that are missing a corresponding entry in celery_taskmeta to be cleaned up and re-executed by the scheduler.
### How to reproduce
Currently there is no deterministic way to reproduce it, other than running a large number of tasks and then removing a worker at just the right time.
Occurs every morning in a handful of tasks, but no deterministic way to reproduce it.
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19699 | https://github.com/apache/airflow/pull/19769 | 2b4bf7fe67fc656ceb7bdaad36453b7a5b83ef04 | 14ee831c7ad767e31a3aeccf3edbc519b3b8c923 | "2021-11-19T07:00:42Z" | python | "2022-01-14T08:55:15Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,688 | ["docs/helm-chart/airflow-configuration.rst"] | Airflow does not load default connections | ### Apache Airflow version
2.2.1
### Operating System
linux
### Versions of Apache Airflow Providers
_No response_
### Deployment
Official Apache Airflow Helm Chart
### Deployment details
deploy with helm in EKS.
### What happened
When I deployed my Airflow via Helm, I wanted to use the aws_default connection in my DAGs, but when passing this connection Airflow logged that it does not exist. So I looked in the Airflow UI and in the Postgres connection table, and this connection did not exist.
### What you expected to happen
I expected that the default connections was created with the initial deploy.
### How to reproduce
You just need to deploy Airflow via the Helm chart, check the value of the AIRFLOW__CORE__LOAD_DEFAULT_CONNECTIONS env var, then go to the connection table in the database and check whether any connections were created.
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19688 | https://github.com/apache/airflow/pull/19708 | d69b4c9dc82b6c35c387bb819b95cf41fb974ab8 | 1983bf95806422146a3750945a65fd71364dc973 | "2021-11-18T18:33:14Z" | python | "2021-11-24T10:46:39Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,654 | ["chart/templates/_helpers.yaml", "chart/values.schema.json", "chart/values.yaml", "tests/charts/test_pgbouncer.py"] | Pgbouncer auth type cannot be configured | ### Official Helm Chart version
1.3.0 (latest released)
### Apache Airflow version
2.2.1
### Kubernetes Version
1.20.9
### Helm Chart configuration
_No response_
### Docker Image customisations
_No response_
### What happened
The default pgbouncer config generated by the helm chart looks something like
```
[databases]
...
[pgbouncer]
pool_mode = transaction
listen_port = 6543
listen_addr = *
auth_type = md5
auth_file = /etc/pgbouncer/users.txt
stats_users = <user>
ignore_startup_parameters = extra_float_digits
max_client_conn = 200
verbose = 0
log_disconnections = 0
log_connections = 0
server_tls_sslmode = require
server_tls_ciphers = normal
```
If the database being connected to is Azure PostgreSQL, the auth type `md5` [does not seem](https://github.com/pgbouncer/pgbouncer/issues/325) to be supported. Hence, we need to add a configuration flag to the Helm chart to change the auth type to something else.
### What you expected to happen
_No response_
### How to reproduce
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19654 | https://github.com/apache/airflow/pull/21999 | 3c22565ac862cfe3a3a28a097dc1b7c9987c5d76 | f482ae5570b1a3979ee6b382633e7181a533ba93 | "2021-11-17T14:45:33Z" | python | "2022-03-26T19:55:43Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,647 | ["airflow/www/extensions/init_appbuilder.py"] | Wrong path for assets | ### Apache Airflow version
2.2.2 (latest released)
### Operating System
macOS
### Versions of Apache Airflow Providers
_No response_
### Deployment
Docker-Compose
### Deployment details
Running things locally with pyenv as well as Breeze.
### What happened
I pulled the latest version last night, but that doesn't seem to fix the issue, neither with a local pyenv nor with Docker.
I cannot copy separate logs from tmux, iTerm doesn't seem to allow that, but I can show the logs from the boot process of the breeze env:
```
Good version of docker 20.10.8.
f321cbfd33eeb8db8effc8f9bc4c3d5317758927da26abeb4c08b14fad09ff6b
f321cbfd33eeb8db8effc8f9bc4c3d5317758927da26abeb4c08b14fad09ff6b
No need to pull the image. Yours and remote cache hashes are the same!
The CI image for Python python:3.8-slim-buster image likely needs to be rebuild
The files were modified since last build: setup.py setup.cfg Dockerfile.ci .dockerignore scripts/docker/compile_www_assets.sh scripts/docker/common.sh scripts/docker/install_additional_dependencies.sh scripts/docker/install_airflow.sh scripts/docker/install_airflow_dependencies_from_branch_tip.sh scripts/docker/install_from_docker_context_files.sh scripts/docker/install_mysql.sh airflow/www/package.json airflow/www/yarn.lock airflow/www/webpack.config.js airflow/ui/package.json airflow/ui/yarn.lock
WARNING!!!!:Make sure that you rebased to latest upstream before rebuilding or the rebuild might take a lot of time!
Please confirm pull and rebuild image CI-python3.8 (or wait 4 seconds to skip it). Are you sure? [y/N/q]
The answer is 'no'. Skipping pull and rebuild image CI-python3.8.
@&&&&&&@
@&&&&&&&&&&&@
&&&&&&&&&&&&&&&&
&&&&&&&&&&
&&&&&&&
&&&&&&&
@@@@@@@@@@@@@@@@ &&&&&&
@&&&&&&&&&&&&&&&&&&&&&&&&&&
&&&&&&&&&&&&&&&&&&&&&&&&&&&&
&&&&&&&&&&&&
&&&&&&&&&
&&&&&&&&&&&&
@@&&&&&&&&&&&&&&&@
@&&&&&&&&&&&&&&&&&&&&&&&&&&&& &&&&&&
&&&&&&&&&&&&&&&&&&&&&&&&&&&& &&&&&&
&&&&&&&&&&&&&&&&&&&&&&&& &&&&&&
&&&&&&
&&&&&&&
@&&&&&&&&
@&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&
&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&
&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&&
@&&&@ && @&&&&&&&&&&& &&&&&&&&&&&& && &&&&&&&&&& &&& &&& &&&
&&& &&& && @&& &&& && && &&& &&&@ &&& &&&&& &&&
&&& &&& && @&&&&&&&&&&&& &&&&&&&&&&& && && &&& &&& &&& &&@ &&&
&&&&&&&&&&& && @&&&&&&&&& && && &&@ &&& &&@&& &&@&&
&&& &&& && @&& &&&@ && &&&&&&&&&&& &&&&&&&&&&&& &&&& &&&&
&&&&&&&&&&&& &&&&&&&&&&&& &&&&&&&&&&&@ &&&&&&&&&&&& &&&&&&&&&&& &&&&&&&&&&&
&&& &&& && &&& && &&& &&&& &&
&&&&&&&&&&&&@ &&&&&&&&&&&& &&&&&&&&&&& &&&&&&&&&&& &&&& &&&&&&&&&&
&&& && && &&&& && &&& &&&& &&
&&&&&&&&&&&&& && &&&&@ &&&&&&&&&&&@ &&&&&&&&&&&& @&&&&&&&&&&& &&&&&&&&&&&
Use CI image.
Branch name: main
Docker image: ghcr.io/apache/airflow/main/ci/python3.8:latest
Airflow source version: 2.3.0.dev0
Python version: 3.8
Backend: mysql 5.7
####################################################################################################
Airflow Breeze CHEATSHEET
####################################################################################################
Adding breeze to your path:
When you exit the environment, you can add sources of Airflow to the path - you can
run breeze or the scripts above from any directory by calling 'breeze' commands directly
export PATH=${PATH}:"/Users/burakkarakan/Code/anything-else/airflow"
####################################################################################################
Port forwarding:
Ports are forwarded to the running docker containers for webserver and database
* 12322 -> forwarded to Airflow ssh server -> airflow:22
* 28080 -> forwarded to Airflow webserver -> airflow:8080
* 25555 -> forwarded to Flower dashboard -> airflow:5555
* 25433 -> forwarded to Postgres database -> postgres:5432
* 23306 -> forwarded to MySQL database -> mysql:3306
* 21433 -> forwarded to MSSQL database -> mssql:1443
* 26379 -> forwarded to Redis broker -> redis:6379
Here are links to those services that you can use on host:
* ssh connection for remote debugging: ssh -p 12322 [email protected] pw: airflow
* Webserver: http://127.0.0.1:28080
* Flower: http://127.0.0.1:25555
* Postgres: jdbc:postgresql://127.0.0.1:25433/airflow?user=postgres&password=airflow
* Mysql: jdbc:mysql://127.0.0.1:23306/airflow?user=root
* Redis: redis://127.0.0.1:26379/0
####################################################################################################
You can setup autocomplete by running 'breeze setup-autocomplete'
####################################################################################################
You can toggle ascii/cheatsheet by running:
* breeze toggle-suppress-cheatsheet
* breeze toggle-suppress-asciiart
####################################################################################################
Checking resources.
* Memory available 5.9G. OK.
* CPUs available 3. OK.
WARNING!!!: Not enough Disk space available for Docker.
At least 40 GBs recommended. You have 28G
WARNING!!!: You have not enough resources to run Airflow (see above)!
Please follow the instructions to increase amount of resources available:
Please check https://github.com/apache/airflow/blob/main/BREEZE.rst#resources-required for details
Good version of docker-compose: 1.29.2
WARNING: The ENABLE_TEST_COVERAGE variable is not set. Defaulting to a blank string.
Creating network "docker-compose_default" with the default driver
Creating volume "docker-compose_sqlite-db-volume" with default driver
Creating volume "docker-compose_postgres-db-volume" with default driver
Creating volume "docker-compose_mysql-db-volume" with default driver
Creating volume "docker-compose_mssql-db-volume" with default driver
Creating docker-compose_mysql_1 ... done
Creating docker-compose_airflow_run ... done
Airflow home: /root/airflow
Airflow sources: /opt/airflow
Airflow core SQL connection: mysql://root@mysql/airflow?charset=utf8mb4
Using already installed airflow version
No need for www assets recompilation.
===============================================================================================
Checking integrations and backends
===============================================================================================
MySQL: OK.
-----------------------------------------------------------------------------------------------
-----------------------------------------------------------------------------------------------
Starting Airflow
Your dags for webserver and scheduler are read from /files/dags directory
which is mounted from your <AIRFLOW_SOURCES>/files/dags folder
You can add /files/airflow-breeze-config directory and place variables.env
In it to make breeze source the variables automatically for you
You can add /files/airflow-breeze-config directory and place .tmux.conf
in it to make breeze use your local .tmux.conf for tmux
```
The logs say `No need for www assets recompilation.` which signals that the assets are already up-to-date. However, when I visit the page, the files are not there:
<img width="687" alt="image" src="https://user-images.githubusercontent.com/16530606/142189803-92dc5e9f-c940-4272-98a1-e5845344b62d.png">
- When I search the whole directory with `find . -name 'ab_filters.js'`, there's no file, which means the issue is not with the browser cache.
- Just in case there was another race condition, I ran the following command again to see if it'd generate the file for some reason: `./breeze initialize-local-virtualenv --python 3.8`, but the result of `find . -name 'ab_filters.js'` is still empty.
- Then I ran `./airflow/www/compile_assets.sh` but that didn't make a difference as well, `find` is still empty.
Here's the output from the `compile_assets.sh`:
```
❯ ./airflow/www/compile_assets.sh
yarn install v1.22.17
[1/4] 🔍 Resolving packages...
success Already up-to-date.
✨ Done in 0.56s.
yarn run v1.22.17
$ NODE_ENV=production webpack --colors --progress
23% building 26/39 modules 13 active ...www/node_modules/babel-loader/lib/index.js??ref--5!/Users/burakkarakan/Code/anything-else/airflow/airflow/www/static/js/datetime_utils.jspostcss-modules-values: postcss.plugin was deprecated. Migration guide:
https://evilmartians.com/chronicles/postcss-8-plugin-migration
postcss-modules-local-by-default: postcss.plugin was deprecated. Migration guide:
https://evilmartians.com/chronicles/postcss-8-plugin-migration
modules-extract-imports: postcss.plugin was deprecated. Migration guide:
https://evilmartians.com/chronicles/postcss-8-plugin-migration
postcss-modules-scope: postcss.plugin was deprecated. Migration guide:
https://evilmartians.com/chronicles/postcss-8-plugin-migration
postcss-import-parser: postcss.plugin was deprecated. Migration guide:
https://evilmartians.com/chronicles/postcss-8-plugin-migration
postcss-icss-parser: postcss.plugin was deprecated. Migration guide:
https://evilmartians.com/chronicles/postcss-8-plugin-migration
postcss-url-parser: postcss.plugin was deprecated. Migration guide:
https://evilmartians.com/chronicles/postcss-8-plugin-migration
Hash: c303806b9efab4bf5dec
Version: webpack 4.44.2
Time: 4583ms
Built at: 11/17/2021 11:13:58 AM
Asset Size Chunks Chunk Names
../../../../licenses/LICENSES-ui.txt 52.8 KiB [emitted]
airflowDefaultTheme.9ef6a9e2f0de25c0b346.css 102 KiB 0 [emitted] [immutable] airflowDefaultTheme
airflowDefaultTheme.9ef6a9e2f0de25c0b346.js 4.15 KiB 0 [emitted] [immutable] airflowDefaultTheme
bootstrap-datetimepicker.min.css 7.54 KiB [emitted]
bootstrap-datetimepicker.min.js 37.1 KiB [emitted]
bootstrap3-typeahead.min.js 10 KiB [emitted]
calendar.5260e8f126017610ad73.css 1.06 KiB 1 [emitted] [immutable] calendar
calendar.5260e8f126017610ad73.js 15.4 KiB 1 [emitted] [immutable] calendar
codemirror.css 5.79 KiB [emitted]
codemirror.js 389 KiB [emitted] [big]
coffeescript-lint.js 1.43 KiB [emitted]
connectionForm.be3bf4692736d58cfdb0.js 12.8 KiB 2 [emitted] [immutable] connectionForm
css-lint.js 1.28 KiB [emitted]
d3-shape.min.js 29.1 KiB [emitted]
d3-tip.js 8.99 KiB [emitted]
d3.min.js 148 KiB [emitted]
dag.c0b8852bb690f83bb55e.js 20.4 KiB 3 [emitted] [immutable] dag
dagCode.98dce599559f03115f1f.js 6.48 KiB 4 [emitted] [immutable] dagCode
dagDependencies.c2cdb377b2d3b7be7d1b.js 10.4 KiB 5 [emitted] [immutable] dagDependencies
dagre-d3.core.min.js 27.5 KiB [emitted]
dagre-d3.core.min.js.map 26.3 KiB [emitted]
dagre-d3.min.js 708 KiB [emitted] [big]
dagre-d3.min.js.map 653 KiB [emitted] [big]
dags.0ca53db014891875da7d.css 2.59 KiB 6, 3, 18 [emitted] [immutable] dags
dags.0ca53db014891875da7d.js 45.9 KiB 6, 3, 18 [emitted] [immutable] dags
dataTables.bootstrap.min.css 4.13 KiB [emitted]
dataTables.bootstrap.min.js 1.93 KiB [emitted]
durationChart.ca520df04ff71dd5fab9.js 5.11 KiB 7 [emitted] [immutable] durationChart
flash.ab8296a74435427f9b53.css 1.36 KiB 8 [emitted] [immutable] flash
flash.ab8296a74435427f9b53.js 4.12 KiB 8 [emitted] [immutable] flash
gantt.d7989000350b53dc0855.css 1.1 KiB 9, 3, 18 [emitted] [immutable] gantt
gantt.d7989000350b53dc0855.js 42 KiB 9, 3, 18 [emitted] [immutable] gantt
graph.eaba1e30424750441353.css 2.37 KiB 10, 3, 18 [emitted] [immutable] graph
graph.eaba1e30424750441353.js 55.5 KiB 10, 3, 18 [emitted] [immutable] graph
html-lint.js 1.94 KiB [emitted]
ie.fc8f40153cdecb7eb0b3.js 16.4 KiB 11 [emitted] [immutable] ie
javascript-lint.js 2.11 KiB [emitted]
javascript.js 37.9 KiB [emitted]
jquery.dataTables.min.js 81.6 KiB [emitted]
jshint.js 1.22 MiB [emitted] [big]
json-lint.js 1.3 KiB [emitted]
lint.css 2.55 KiB [emitted]
lint.js 8.91 KiB [emitted]
loadingDots.e4fbfc09969e91db1f49.css 1.21 KiB 12 [emitted] [immutable] loadingDots
loadingDots.e4fbfc09969e91db1f49.js 4.13 KiB 12 [emitted] [immutable] loadingDots
main.216f001f0b6da7966a9f.css 6.85 KiB 13 [emitted] [immutable] main
main.216f001f0b6da7966a9f.js 16.4 KiB 13 [emitted] [immutable] main
manifest.json 3.31 KiB [emitted]
materialIcons.e368f72fd0a7e9a40455.css 109 KiB 14 [emitted] [immutable] materialIcons
materialIcons.e368f72fd0a7e9a40455.js 4.13 KiB 14 [emitted] [immutable] materialIcons
moment.f2be510679d38b9c54e9.js 377 KiB 15 [emitted] [immutable] [big] moment
nv.d3.min.css 8.13 KiB [emitted]
nv.d3.min.css.map 3.59 KiB [emitted]
nv.d3.min.js 247 KiB [emitted] [big]
nv.d3.min.js.map 1.86 MiB [emitted] [big]
oss-licenses.json 66.3 KiB [emitted]
redoc.standalone.js 970 KiB [emitted] [big]
redoc.standalone.js.LICENSE.txt 2.75 KiB [emitted]
redoc.standalone.js.map 3.23 MiB [emitted] [big]
switch.3e30e60646cdea5e4216.css 2.04 KiB 16 [emitted] [immutable] switch
switch.3e30e60646cdea5e4216.js 4.12 KiB 16 [emitted] [immutable] switch
task.8082a6cd3c389845ca0c.js 5.33 KiB 17 [emitted] [immutable] task
taskInstances.d758e4920a32ca069541.js 33.1 KiB 18, 3 [emitted] [immutable] taskInstances
tiLog.fc2c3580403a943ccddb.js 23.8 KiB 19 [emitted] [immutable] tiLog
tree.57c43dd706cbd3d74ef9.css 1.31 KiB 20, 3 [emitted] [immutable] tree
tree.57c43dd706cbd3d74ef9.js 1.48 MiB 20, 3 [emitted] [immutable] [big] tree
trigger.57a3ebbaee0f22bd5022.js 5.03 KiB 21 [emitted] [immutable] trigger
variableEdit.45c5312f076fbe019680.js 4.97 KiB 22 [emitted] [immutable] variableEdit
yaml-lint.js 1.23 KiB [emitted]
Entrypoint airflowDefaultTheme = airflowDefaultTheme.9ef6a9e2f0de25c0b346.css airflowDefaultTheme.9ef6a9e2f0de25c0b346.js
Entrypoint connectionForm = connectionForm.be3bf4692736d58cfdb0.js
Entrypoint dag = dag.c0b8852bb690f83bb55e.js
Entrypoint dagCode = dagCode.98dce599559f03115f1f.js
Entrypoint dagDependencies = dagDependencies.c2cdb377b2d3b7be7d1b.js
Entrypoint dags = dags.0ca53db014891875da7d.css dags.0ca53db014891875da7d.js
Entrypoint flash = flash.ab8296a74435427f9b53.css flash.ab8296a74435427f9b53.js
Entrypoint gantt = gantt.d7989000350b53dc0855.css gantt.d7989000350b53dc0855.js
Entrypoint graph = graph.eaba1e30424750441353.css graph.eaba1e30424750441353.js
Entrypoint ie = ie.fc8f40153cdecb7eb0b3.js
Entrypoint loadingDots = loadingDots.e4fbfc09969e91db1f49.css loadingDots.e4fbfc09969e91db1f49.js
Entrypoint main = main.216f001f0b6da7966a9f.css main.216f001f0b6da7966a9f.js
Entrypoint materialIcons = materialIcons.e368f72fd0a7e9a40455.css materialIcons.e368f72fd0a7e9a40455.js
Entrypoint moment [big] = moment.f2be510679d38b9c54e9.js
Entrypoint switch = switch.3e30e60646cdea5e4216.css switch.3e30e60646cdea5e4216.js
Entrypoint task = task.8082a6cd3c389845ca0c.js
Entrypoint taskInstances = taskInstances.d758e4920a32ca069541.js
Entrypoint tiLog = tiLog.fc2c3580403a943ccddb.js
Entrypoint tree [big] = tree.57c43dd706cbd3d74ef9.css tree.57c43dd706cbd3d74ef9.js
Entrypoint calendar = calendar.5260e8f126017610ad73.css calendar.5260e8f126017610ad73.js
Entrypoint durationChart = durationChart.ca520df04ff71dd5fab9.js
Entrypoint trigger = trigger.57a3ebbaee0f22bd5022.js
Entrypoint variableEdit = variableEdit.45c5312f076fbe019680.js
[8] ./static/js/dag.js 8.77 KiB {3} {6} {9} {10} {18} {20} [built]
[16] ./static/js/task_instances.js 4.54 KiB {6} {9} {10} {18} [built]
[69] ./static/css/bootstrap-theme.css 50 bytes {0} [built]
[70] ./static/js/connection_form.js 7.35 KiB {2} [built]
[71] ./static/js/dag_code.js 1.09 KiB {4} [built]
[72] ./static/js/dag_dependencies.js 6.39 KiB {5} [built]
[73] multi ./static/css/dags.css ./static/js/dags.js 40 bytes {6} [built]
[76] ./static/css/flash.css 50 bytes {8} [built]
[77] multi ./static/css/gantt.css ./static/js/gantt.js 40 bytes {9} [built]
[80] multi ./static/css/graph.css ./static/js/graph.js 40 bytes {10} [built]
[83] ./static/js/ie.js 887 bytes {11} [built]
[85] ./static/css/loading-dots.css 50 bytes {12} [built]
[86] multi ./static/css/main.css ./static/js/main.js 40 bytes {13} [built]
[88] ./static/css/material-icons.css 50 bytes {14} [built]
[93] ./static/css/switch.css 50 bytes {16} [built]
+ 425 hidden modules
WARNING in configuration
The 'mode' option has not been set, webpack will fallback to 'production' for this value. Set 'mode' option to 'development' or 'production' to enable defaults for each environment.
You can also set it to 'none' to disable any default behavior. Learn more: https://webpack.js.org/configuration/mode/
WARNING in asset size limit: The following asset(s) exceed the recommended size limit (244 KiB).
This can impact web performance.
Assets:
moment.f2be510679d38b9c54e9.js (377 KiB)
tree.57c43dd706cbd3d74ef9.js (1.48 MiB)
nv.d3.min.js (247 KiB)
nv.d3.min.js.map (1.86 MiB)
dagre-d3.min.js (708 KiB)
dagre-d3.min.js.map (653 KiB)
redoc.standalone.js (970 KiB)
redoc.standalone.js.map (3.23 MiB)
codemirror.js (389 KiB)
jshint.js (1.22 MiB)
WARNING in entrypoint size limit: The following entrypoint(s) combined asset size exceeds the recommended limit (244 KiB). This can impact web performance.
Entrypoints:
moment (377 KiB)
moment.f2be510679d38b9c54e9.js
tree (1.48 MiB)
tree.57c43dd706cbd3d74ef9.css
tree.57c43dd706cbd3d74ef9.js
WARNING in webpack performance recommendations:
You can limit the size of your bundles by using import() or require.ensure to lazy load some parts of your application.
For more info visit https://webpack.js.org/guides/code-splitting/
Child mini-css-extract-plugin node_modules/css-loader/dist/cjs.js!static/css/bootstrap-theme.css:
Entrypoint mini-css-extract-plugin = *
[0] ./node_modules/css-loader/dist/cjs.js!./static/css/bootstrap-theme.css 130 KiB {0} [built]
+ 1 hidden module
Child mini-css-extract-plugin node_modules/css-loader/dist/cjs.js!static/css/calendar.css:
Entrypoint mini-css-extract-plugin = *
[0] ./node_modules/css-loader/dist/cjs.js!./static/css/calendar.css 1.47 KiB {0} [built]
+ 1 hidden module
Child mini-css-extract-plugin node_modules/css-loader/dist/cjs.js!static/css/dags.css:
Entrypoint mini-css-extract-plugin = *
[0] ./node_modules/css-loader/dist/cjs.js!./static/css/dags.css 3.31 KiB {0} [built]
+ 1 hidden module
Child mini-css-extract-plugin node_modules/css-loader/dist/cjs.js!static/css/flash.css:
Entrypoint mini-css-extract-plugin = *
[0] ./node_modules/css-loader/dist/cjs.js!./static/css/flash.css 2.25 KiB {0} [built]
+ 1 hidden module
Child mini-css-extract-plugin node_modules/css-loader/dist/cjs.js!static/css/gantt.css:
Entrypoint mini-css-extract-plugin = *
[0] ./node_modules/css-loader/dist/cjs.js!./static/css/gantt.css 1.58 KiB {0} [built]
+ 1 hidden module
Child mini-css-extract-plugin node_modules/css-loader/dist/cjs.js!static/css/graph.css:
Entrypoint mini-css-extract-plugin = *
[0] ./node_modules/css-loader/dist/cjs.js!./static/css/graph.css 3.16 KiB {0} [built]
+ 1 hidden module
Child mini-css-extract-plugin node_modules/css-loader/dist/cjs.js!static/css/loading-dots.css:
Entrypoint mini-css-extract-plugin = *
[0] ./node_modules/css-loader/dist/cjs.js!./static/css/loading-dots.css 1.64 KiB {0} [built]
+ 1 hidden module
Child mini-css-extract-plugin node_modules/css-loader/dist/cjs.js!static/css/main.css:
Entrypoint mini-css-extract-plugin = *
[0] ./node_modules/css-loader/dist/cjs.js!./static/css/main.css 10.7 KiB {0} [built]
[3] ./static/sort_both.png 307 bytes {0} [built]
[4] ./static/sort_desc.png 251 bytes {0} [built]
[5] ./static/sort_asc.png 255 bytes {0} [built]
+ 2 hidden modules
Child mini-css-extract-plugin node_modules/css-loader/dist/cjs.js!static/css/material-icons.css:
Entrypoint mini-css-extract-plugin = *
[0] ./node_modules/css-loader/dist/cjs.js!./static/css/material-icons.css 110 KiB {0} [built]
+ 1 hidden module
Child mini-css-extract-plugin node_modules/css-loader/dist/cjs.js!static/css/switch.css:
Entrypoint mini-css-extract-plugin = *
[0] ./node_modules/css-loader/dist/cjs.js!./static/css/switch.css 2.69 KiB {0} [built]
+ 1 hidden module
Child mini-css-extract-plugin node_modules/css-loader/dist/cjs.js!static/css/tree.css:
Entrypoint mini-css-extract-plugin = *
[0] ./node_modules/css-loader/dist/cjs.js!./static/css/tree.css 1.8 KiB {0} [built]
+ 1 hidden module
✨ Done in 7.53s.
```
### What you expected to happen
_No response_
### How to reproduce
Check out the `main` branch locally and run the project.
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19647 | https://github.com/apache/airflow/pull/19661 | 7cda7d4b5e413925bf639976e77ebf2442b4bff9 | a81ae61ecfc4274780b571ff2f599f7c75875e14 | "2021-11-17T13:30:59Z" | python | "2021-11-17T19:02:48Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,641 | ["airflow/providers/amazon/aws/hooks/eks.py", "airflow/providers/amazon/aws/operators/eks.py", "tests/providers/amazon/aws/hooks/test_eks.py", "tests/providers/amazon/aws/operators/test_eks.py", "tests/providers/amazon/aws/utils/eks_test_constants.py", "tests/providers/amazon/aws/utils/eks_test_utils.py"] | EKSCreateNodegroupOperator - missing kwargs | ### Description
Boto3 / eks / create_nodegroup api supports the following kwargs:
```python
clusterName='string',
nodegroupName='string',
scalingConfig={
'minSize': 123,
'maxSize': 123,
'desiredSize': 123
},
diskSize=123,
subnets=[
'string',
],
instanceTypes=[
'string',
],
amiType='AL2_x86_64'|'AL2_x86_64_GPU'|'AL2_ARM_64'|'CUSTOM'|'BOTTLEROCKET_ARM_64'|'BOTTLEROCKET_x86_64',
remoteAccess={
'ec2SshKey': 'string',
'sourceSecurityGroups': [
'string',
]
},
nodeRole='string',
labels={
'string': 'string'
},
taints=[
{
'key': 'string',
'value': 'string',
'effect': 'NO_SCHEDULE'|'NO_EXECUTE'|'PREFER_NO_SCHEDULE'
},
],
tags={
'string': 'string'
},
clientRequestToken='string',
launchTemplate={
'name': 'string',
'version': 'string',
'id': 'string'
},
updateConfig={
'maxUnavailable': 123,
'maxUnavailablePercentage': 123
},
capacityType='ON_DEMAND'|'SPOT',
version='string',
releaseVersion='string'
```
The current implementation of the operator supports only the following kwargs:
```python
clusterName='string',
nodegroupName='string',
subnets=[ 'string', ],
nodeRole='string',
```
### Use case/motivation
With the current implementation of the operator you can only bring up a basic EKS nodegroup, which is not very useful. I want to fully configure my nodegroup through this operator, just as I can with boto3.
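A minimal sketch of the kind of pass-through support this request is about; the `create_nodegroup_kwargs` argument below is an assumption for illustration, not an existing operator parameter, and other argument names may differ between provider versions:
```python
from airflow.providers.amazon.aws.operators.eks import EKSCreateNodegroupOperator

# Hypothetical usage: extra boto3 create_nodegroup kwargs are forwarded verbatim.
create_nodegroup = EKSCreateNodegroupOperator(
    task_id="create_nodegroup",
    cluster_name="my-cluster",
    nodegroup_name="my-nodegroup",
    nodegroup_subnets=["subnet-12345", "subnet-67890"],
    nodegroup_role_arn="arn:aws:iam::123456789012:role/eks-nodegroup-role",
    create_nodegroup_kwargs={  # assumed parameter, not yet part of the operator
        "scalingConfig": {"minSize": 1, "maxSize": 5, "desiredSize": 2},
        "instanceTypes": ["m5.large"],
        "capacityType": "SPOT",
        "labels": {"team": "data"},
    },
)
```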
### Related issues
Continuation of #19372
### Are you willing to submit a PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19641 | https://github.com/apache/airflow/pull/20819 | 88814587d451be7493e005e4d477609a39caa1d9 | 307d35651998901b064b02a0748b1c6f96ae3ac0 | "2021-11-17T10:42:50Z" | python | "2022-01-14T17:05:09Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,618 | ["RELEASE_NOTES.rst"] | Execution_date not rendering after airflow upgrade | ### Apache Airflow version
2.2.2 (latest released)
### Operating System
Debian GNU/Linux
### Versions of Apache Airflow Providers
apache-airflow-providers-amazon==2.3.0
apache-airflow-providers-apache-spark==2.0.1
apache-airflow-providers-cncf-kubernetes==2.1.0
apache-airflow-providers-ftp==2.0.1
apache-airflow-providers-http==2.0.1
apache-airflow-providers-imap==2.0.1
apache-airflow-providers-sqlite==2.0.1
### Deployment
Official Apache Airflow Helm Chart
### Deployment details
_No response_
### What happened
Hi,
We recently upgraded airflow from 2.1.0 to 2.2.2 (2.1.0 to 2.2.0 to 2.2.1 to 2.2.2) and DAGs aren't running as expected. All these DAGs were added before the upgrade itself and they were running fine.
We use an `execution_date`-based parameter in SparkSubmitOperator, which rendered fine before the upgrade but now fails because the expression returns None:
"{{ (execution_date if execution_date.microsecond > 0 else dag.following_schedule(execution_date)).isoformat() }}"
DAG run fails with the error
jinja2.exceptions.UndefinedError: 'None' has no attribute 'isoformat'
I tried wiping out the database and running it as a fresh DAG, but the error is still the same.
Any help would be appreciated
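A workaround sketch I am considering, assuming the intent of the template is "the end of the schedule interval": Airflow 2.2 exposes that directly as `data_interval_end`, while `dag.following_schedule()` is deprecated there and can return `None` for timetable-based DAGs:
```python
from airflow.providers.apache.spark.operators.spark_submit import SparkSubmitOperator

# Rely on the data_interval_end template variable instead of deriving the
# next schedule from execution_date (the application path is a placeholder).
submit = SparkSubmitOperator(
    task_id="submit_job",
    application="/path/to/app.py",
    application_args=["{{ data_interval_end.isoformat() }}"],
)
```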
### What you expected to happen
_No response_
### How to reproduce
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19618 | https://github.com/apache/airflow/pull/24413 | dfdf8eb28f952bc42d8149043bcace9bfe882d76 | ce48567c54d0124840062b6bd86c2a745d3cc6d0 | "2021-11-16T14:53:15Z" | python | "2022-06-13T14:42:24Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,613 | ["airflow/www/views.py"] | Can't open task log from Gantt view | ### Apache Airflow version
2.2.1
### Operating System
Linux 5.4.149-73.259.amzn2.x86_64
### Versions of Apache Airflow Providers
default for 2.2.1
### Deployment
Other 3rd-party Helm chart
### Deployment details
aws eks using own-developed helm chart
### What happened
When trying to open log from gantt view - receiving an exception
```
File "/home/airflow/.local/lib/python3.9/site-packages/pendulum/parsing/__init__.py", line 177, in _parse_common
return date(year, month, day)
ValueError: year 0 is out of range
```
due to an incorrect query parameter: no value for `execution_date` is pushed
```
/log?dag_id=report_generator_daily&task_id=check_quints_earnings&execution_date=
```
### What you expected to happen
Logs should be available
### How to reproduce
Open dag's `gantt` chart
click on task ribbon
click on `log`
observe an error
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19613 | https://github.com/apache/airflow/pull/20121 | b37c0efabd29b9f20ba05c0e1281de22809e0624 | f59decd391b75c509020e603e5857bb63ec891be | "2021-11-16T07:52:23Z" | python | "2021-12-08T16:11:43Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,581 | ["airflow/hooks/dbapi.py", "airflow/operators/generic_transfer.py", "airflow/providers/google/cloud/hooks/workflows.py", "airflow/providers/google/cloud/operators/workflows.py", "airflow/providers/postgres/hooks/postgres.py", "airflow/providers/sqlite/hooks/sqlite.py", "scripts/in_container/run_generate_constraints.sh"] | Miscellaneous documentation typos | ### Describe the issue with documentation
Recently starting out with Airflow, I've been reading parts of the documentation quite carefully. There are at least two typos that could be fixed. First, looking at [Module management](https://airflow.apache.org/docs/apache-airflow/stable/modules_management.html) I see: "You can see the `.ariflowignore` file at the root of your folder." I'm rather confused by this, since when looking at [module_management.rst](https://github.com/apache/airflow/blob/main/docs/apache-airflow/modules_management.rst) it seems to say "You can see the `.airflowignore` file at the root of your folder" (no typo). Why's that?
Second, looking at the docs for [GenericTransfer](https://airflow.apache.org/docs/apache-airflow/stable/_api/airflow/operators/generic_transfer/index.html), the explanation for `destination_conn_id (str)` I believe should be e.g., `destination connection` and not `source connection` (compare with `source_conn_id (str)`). However, I'm unable to find the corresponding doc from the repo. When clicking on "Suggest a change on this page" I end up with 404.
### How to solve the problem
See above for the suggested fixes as well. I'd be happy to submit a PR too (but see some questions above as well).
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19581 | https://github.com/apache/airflow/pull/19599 | b9d31cd44962fc376fcf98380eaa1ea60fb6c835 | 355dec8fea5e2ef1a9b88363f201fce4f022fef3 | "2021-11-14T12:15:15Z" | python | "2021-11-17T18:12:17Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,569 | ["dev/README_RELEASE_AIRFLOW.md", "scripts/ci/tools/prepare_prod_docker_images.sh"] | The latest docker image is not the "latest" | ### Apache Airflow version
2.2.1 (latest released)
### Operating System
N/A
### Versions of Apache Airflow Providers
_No response_
### Deployment
Other Docker-based deployment
### Deployment details
_No response_
### What happened
The image tags `latest` and `latest-python3.X` point to images released either two months ago or even five months ago.
https://hub.docker.com/r/apache/airflow/tags?name=latest
### What you expected to happen
According to the documentation here [1], it seems they should be aligned with the latest stable version.
BTW, I am willing to submit a PR, but I might need some hints on how we manage the Docker image tags.
[1] https://airflow.apache.org/docs/docker-stack/index.html
### How to reproduce
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19569 | https://github.com/apache/airflow/pull/19573 | 1453b959a614ab1ac045e61b9e5939def0ad9dff | 4a072725cbe63bff8f69b05dfb960134783ee17e | "2021-11-13T03:07:10Z" | python | "2021-11-15T21:34:12Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,490 | ["airflow/providers/databricks/hooks/databricks.py", "tests/providers/databricks/hooks/test_databricks.py"] | provider Databricks : add cluster API | ### Description
Add Databricks Cluster API in Databricks Hook
### Use case/motivation
The Databricks provider lacks access to the [Cluster API](https://docs.databricks.com/dev-tools/api/latest/clusters.html), which would allow it to create, read, update, and delete clusters as well as manage them (start/restart/delete/...).
These APIs are very convenient for controlling clusters (getting status, forcing shutdown, ...) and for optimizing some specific flows (scaling a cluster up in advance when an upcoming need is detected).
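A rough sketch of what such methods could look like, built on the existing `DatabricksHook._do_api_call` helper and the public `api/2.0/clusters/*` REST endpoints; the class and method names below are illustrative, not existing provider API:
```python
from airflow.providers.databricks.hooks.databricks import DatabricksHook


class DatabricksClusterHook(DatabricksHook):
    """Illustrative extension exposing a few Cluster API calls."""

    def get_cluster_state(self, cluster_id: str) -> str:
        # api/2.0/clusters/get returns the full cluster spec, including its `state`.
        response = self._do_api_call(("GET", "api/2.0/clusters/get"), {"cluster_id": cluster_id})
        return response["state"]

    def start_cluster(self, cluster_id: str) -> None:
        self._do_api_call(("POST", "api/2.0/clusters/start"), {"cluster_id": cluster_id})

    def restart_cluster(self, cluster_id: str) -> None:
        self._do_api_call(("POST", "api/2.0/clusters/restart"), {"cluster_id": cluster_id})

    def terminate_cluster(self, cluster_id: str) -> None:
        # The REST endpoint named "delete" terminates the cluster.
        self._do_api_call(("POST", "api/2.0/clusters/delete"), {"cluster_id": cluster_id})
```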
### Related issues
none
### Are you willing to submit a PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19490 | https://github.com/apache/airflow/pull/34643 | 6ba2c4485cb8ff2cf3c2e4d8043e4c7fe5008b15 | 946b539f0dbdc13272a44bdb6f756282f1d373e1 | "2021-11-09T13:33:30Z" | python | "2023-10-12T09:57:50Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,489 | ["airflow/providers/postgres/hooks/postgres.py", "docs/apache-airflow-providers-postgres/connections/postgres.rst", "tests/providers/postgres/hooks/test_postgres.py"] | PostgresSqlHook needs to override DbApiHook.get_uri to pull in extra for client_encoding=utf-8 during create_engine | ### Description
I got the following error:
```
[2021-11-09, 08:25:30 UTC] {base.py:70} INFO - Using connection to: id: rdb_conn_id. Host: rdb, Port: 5432, Schema: configuration, Login: user, Password: ***, extra: {'sslmode': 'allow', 'client_encoding': 'utf8'}
[2021-11-09, 08:25:30 UTC] {taskinstance.py:1703} ERROR - Task failed with exception
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1332, in _run_raw_task
self._execute_task_with_callbacks(context)
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1458, in _execute_task_with_callbacks
result = self._execute_task(context, self.task)
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/models/taskinstance.py", line 1514, in _execute_task
result = execute_callable(context=context)
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/operators/python.py", line 151, in execute
return_value = self.execute_callable()
File "/home/airflow/.local/lib/python3.9/site-packages/airflow/operators/python.py", line 162, in execute_callable
return self.python_callable(*self.op_args, **self.op_kwargs)
File "/opt/airflow/dags/repo/dags/run_configuration.py", line 34, in run_job
dagsUtils.run_step_insert_to_temp_table(tenant, job_name, table_name, job_type)
File "/opt/airflow/dags/rev-e3db01f68e7979d71d12ae24008a97065db2144f/dags/utils/dag_util.py", line 106, in run_step_insert_to_temp_table
for df in df_result:
File "/home/airflow/.local/lib/python3.9/site-packages/pandas/io/sql.py", line 1499, in _query_iterator
data = result.fetchmany(chunksize)
File "/home/airflow/.local/lib/python3.9/site-packages/sqlalchemy/engine/result.py", line 1316, in fetchmany
self.connection._handle_dbapi_exception(
File "/home/airflow/.local/lib/python3.9/site-packages/sqlalchemy/engine/base.py", line 1514, in _handle_dbapi_exception
util.raise_(exc_info[1], with_traceback=exc_info[2])
File "/home/airflow/.local/lib/python3.9/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
raise exception
File "/home/airflow/.local/lib/python3.9/site-packages/sqlalchemy/engine/result.py", line 1311, in fetchmany
l = self.process_rows(self._fetchmany_impl(size))
File "/home/airflow/.local/lib/python3.9/site-packages/sqlalchemy/engine/result.py", line 1224, in _fetchmany_impl
return self.cursor.fetchmany(size)
UnicodeDecodeError: 'ascii' codec can't decode byte 0xe2 in position 12: ordinal not in range(128)
```
I tried to set `extra` in the Airflow connection, but it does not work:

### Use case/motivation
I see that `airflow/providers/mysql/hooks` supports getting `extra` configs from the Airflow connection (https://github.com/apache/airflow/pull/6816), but this is not yet the case for the PostgreSQL hook.
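A sketch of the kind of override this would need, assuming it is enough to append the connection extras to the SQLAlchemy URI as query parameters (illustrative subclass, not existing provider code):
```python
from urllib.parse import urlencode

from airflow.providers.postgres.hooks.postgres import PostgresHook


class PostgresHookWithExtras(PostgresHook):
    """Illustrative subclass: forward connection extras (e.g. client_encoding) to SQLAlchemy."""

    def get_uri(self) -> str:
        uri = super().get_uri()
        conn = self.get_connection(getattr(self, self.conn_name_attr))
        extras = conn.extra_dejson  # e.g. {"sslmode": "allow", "client_encoding": "utf8"}
        if extras:
            separator = "&" if "?" in uri else "?"
            uri = f"{uri}{separator}{urlencode(extras)}"
        return uri
```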
### Related issues
_No response_
### Are you willing to submit a PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19489 | https://github.com/apache/airflow/pull/19827 | a192cecf6bb9b22e058b8c0015c351131185282b | c97a2e8ab84991bb08e811b9d5b6d5f95de150b2 | "2021-11-09T10:44:10Z" | python | "2021-11-26T20:29:20Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,477 | ["airflow/www/views.py"] | 'NoneType' object has no attribute 'refresh_from_task' Error when manually running task instance | ### Discussed in https://github.com/apache/airflow/discussions/19366
<div type='discussions-op-text'>
<sup>Originally posted by **a-pertsev** November 2, 2021</sup>
### Apache Airflow version
2.2.1 (latest released)
### Operating System
Ubuntu 20.04 LTS
### Versions of Apache Airflow Providers
apache-airflow-providers-amazon==2.3.0
apache-airflow-providers-apache-cassandra==2.1.0
apache-airflow-providers-ftp==2.0.1
apache-airflow-providers-google==6.0.0
apache-airflow-providers-http==2.0.1
apache-airflow-providers-imap==2.0.1
apache-airflow-providers-jdbc==2.0.1
apache-airflow-providers-mysql==2.1.1
apache-airflow-providers-postgres==2.3.0
apache-airflow-providers-presto==2.0.1
apache-airflow-providers-slack==4.1.0
apache-airflow-providers-sqlite==2.0.1
### Deployment
Virtualenv installation
### Deployment details
_No response_
### What happened
Got an error when manually running a newly created task in an already finished DAG run (in the UI).
### What you expected to happen
"Run" button should work without exceptions.
### How to reproduce
1. Define a dag with some tasks.
2. Create dag run (manually or by schedule)
3. Add new task into dag, deploy code
4. Select new task in UI and press "Run" button
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
</div> | https://github.com/apache/airflow/issues/19477 | https://github.com/apache/airflow/pull/19478 | 950a390770b04f8990f6cd962bc9c001124f1903 | 51d61a9ec2ee66a7f1b45901a2bb732786341cf4 | "2021-11-08T18:30:20Z" | python | "2021-11-08T19:42:13Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,461 | ["airflow/jobs/scheduler_job.py", "airflow/models/dag.py", "tests/jobs/test_scheduler_job.py"] | Missing DagRuns when catchup=True | ### Apache Airflow version
2.2.1 (latest released)
### Operating System
PRETTY_NAME="Debian GNU/Linux 10 (buster)
### Versions of Apache Airflow Providers
apache-airflow-providers-ftp==2.0.1
apache-airflow-providers-http==2.0.1
apache-airflow-providers-imap==2.0.1
apache-airflow-providers-sqlite==2.0.1
### Deployment
Docker-Compose
### Deployment details
_No response_
### What happened
Backfilling via catchup=True leads to missing DagRuns.
See reproduction steps for full details
### What you expected to happen
_No response_
### How to reproduce
Note, this is an issue which we have experienced in our production environment, with a much more complicated DAG. Below are the reproduction steps using breeze.
1. Setup the ./breeze environment with the below config modifications
2. Create a simple DAG with dummy tasks in it (see below example)
3. Set a `start_date` in the past
4. Set `catchup=True`
5. Unpause the DAG
6. Catch up starts and, if you view the tree view, you have the false impression that everything has caught up correctly.
7. However, if you access the calendar view, you can see the missing DagRuns.
**Breeze Config**
```
export DB_RESET="true"
export START_AIRFLOW="true"
export INSTALL_AIRFLOW_VERSION="2.2.1"
export USE_AIRFLOW_VERSION="2.2.1"
```
**Dummy DAG**
```
from datetime import datetime
from airflow import DAG
from airflow.operators.dummy import DummyOperator
dag = DAG(
dag_id="temp_test",
schedule_interval="@daily",
catchup=True,
start_date=datetime(2021, 8, 1),
max_active_tasks=10,
max_active_runs=5,
is_paused_upon_creation=True,
)
with dag:
task1 = DummyOperator(task_id="task1")
task2 = DummyOperator(task_id="task2")
task3 = DummyOperator(task_id="task3")
task4 = DummyOperator(task_id="task4")
task5 = DummyOperator(task_id="task5")
task1 >> task2 >> task3 >> task4 >> task5
```
**Results**
<img width="1430" alt="tree_view" src="https://user-images.githubusercontent.com/10559757/140715465-6bc3831c-d71c-4025-bcde-985010ab31f8.png">
<img width="1435" alt="calendar_view" src="https://user-images.githubusercontent.com/10559757/140715467-1a1a5c9a-3eb6-40ff-8720-ebe6db999028.png">
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19461 | https://github.com/apache/airflow/pull/19528 | 8d63bdf0bb7a8d695cc00f10e4ebba37dea96af9 | 2bd4b55c53b593f2747a88f4c018d7e420460d9a | "2021-11-08T09:23:52Z" | python | "2021-11-11T09:44:11Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,458 | ["airflow/www/views.py"] | DAG Run Views showing information of DAG duration | ### Description
In Airflow UI. If we go to Browse --> DAG Runs, then DAG Runs view will be displayed.
Table contains a lot of insightful values likes DAG execution date, Start date, end date, Dag config etc
Request to add DAG duration column in the same table.
DAG Duration i.e. DAG end_date timestamp - DAG start_date timestamp
### Use case/motivation
Analytics purpose of knowing DAG duration at all DAGs level. So that I can easily find out which DAG have least and max duration.
### Related issues
Checked open issues. There aren't any
### Are you willing to submit a PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19458 | https://github.com/apache/airflow/pull/19482 | 7640ba4e8ee239d6e2bbf950d53d624b9df93059 | b5c0158b2eb646eb1db5d2c094d3da8f88a08a8b | "2021-11-08T03:57:41Z" | python | "2021-11-29T14:47:02Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,432 | ["airflow/cli/cli_parser.py", "airflow/cli/commands/dag_command.py", "docs/spelling_wordlist.txt", "tests/cli/commands/test_dag_command.py"] | Trigger Reserialization on Demand | ### Description
DAG serialization is currently out of the hands of the user. Whenever DAG reserialization is required, I run this Python script:
```
from airflow.models.serialized_dag import SerializedDagModel
from airflow.settings import Session
session = Session()
session.query(SerializedDagModel).delete()
session.commit()
```
It would be great to have this available from the CLI and/or UI. Or even be able to do it for individual dags without having to delete the whole dag with the trash can button. Also would be worth considering triggering reserialization for any version upgrades that affect dag serialization as part of `airflow db upgrade`.
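A sketch of what a CLI handler for this could look like; the subcommand shape and function name are assumptions, not an existing command:
```python
from airflow.models.dagbag import DagBag
from airflow.models.serialized_dag import SerializedDagModel
from airflow.utils.session import provide_session


@provide_session
def dag_reserialize(args, session=None):
    """Drop all serialized DAGs and repopulate the table from the DAG files."""
    session.query(SerializedDagModel).delete(synchronize_session=False)
    dagbag = DagBag(read_dags_from_db=False)
    dagbag.sync_to_db(session=session)
```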
### Use case/motivation
If your serialized dags get broken for any reason, like the incompatibilities with the changes made in 2.2 recently, reserializing will fix the issue, but it's fairly difficult to trigger reserializing.
### Related issues
_No response_
### Are you willing to submit a PR?
- [x] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19432 | https://github.com/apache/airflow/pull/19471 | 7d555d779dc83566d814a36946bd886c2e7468b3 | c4e8959d141512226a994baeea74d5c7e643c598 | "2021-11-05T19:25:37Z" | python | "2021-11-29T16:54:14Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,384 | ["airflow/providers/apache/livy/hooks/livy.py", "airflow/providers/apache/livy/operators/livy.py", "tests/providers/apache/livy/operators/test_livy.py"] | Add retries to LivyOperator polling / LivyHook | ### Description
Add an optional retry loop to LivyOperator.poll_for_termination() or LivyHook.get_batch_state() to improve resiliency against temporary errors. The retry counter should reset with successful requests.
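A sketch of the polling behaviour described above, written as a standalone helper; the function name, parameters, and the exact exceptions worth catching are assumptions, not existing provider API:
```python
import time

from airflow.exceptions import AirflowException
from airflow.providers.apache.livy.hooks.livy import BatchState, LivyHook

# Livy states that mean the batch has finished one way or another.
TERMINAL_STATES = {BatchState.SUCCESS, BatchState.DEAD, BatchState.KILLED, BatchState.ERROR}


def poll_with_retries(hook: LivyHook, batch_id: int, polling_interval: int = 10,
                      max_consecutive_errors: int = 3) -> BatchState:
    """Poll a Livy batch, tolerating short runs of request failures."""
    consecutive_errors = 0
    while True:
        try:
            state = hook.get_batch_state(batch_id)
            consecutive_errors = 0  # reset the counter after every successful request
        except AirflowException:  # assumed error surface; requests exceptions may be needed too
            consecutive_errors += 1
            if consecutive_errors > max_consecutive_errors:
                raise
            time.sleep(polling_interval)
            continue
        if state in TERMINAL_STATES:
            return state
        time.sleep(polling_interval)
```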
### Use case/motivation
1. Using LivyOperator, we run a Spark Streaming job in a cluster behind Knox with LDAP authentication.
2. While the streaming job is running, LivyOperator keeps polling for termination.
3. In our case, the LDAP service might be unavailable for a few of the polling requests per day, resulting in Knox returning an error.
4. LivyOperator marks the task as failed even though the streaming job should still be running, as subsequent polling requests might have revealed.
5. We would like LivyOperator/LivyHook to send a number of retries in order to overcome those brief availability issues.
Workarounds we considered:
- increase polling interval to reduce the chance of running into an error. For reference, we are currently using an interval of 10s
- use BaseOperator retries to start a new job, only send notification email for the final failure. But this would start a new job unnecessarily
- activate knox authentication caching to decrease the chance of errors substantially, but it was causing issues not related to Airflow
### Related issues
No related issues were found
### Are you willing to submit a PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19384 | https://github.com/apache/airflow/pull/21550 | f6e0ed0dcc492636f6d1a3a413d5df2f9758f80d | 5fdd6fa4796bd52b3ce52ef8c3280999f4e2bb1c | "2021-11-03T15:04:56Z" | python | "2022-02-15T21:59:56Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,342 | ["airflow/www/static/js/ti_log.js"] | Webserver shows wrong datetime (timezone) in log | ### Apache Airflow version
2.2.1 (latest released)
### Operating System
Ubuntu 20.04.2 (docker)
### Versions of Apache Airflow Providers
- apache-airflow-providers-amazon==2.3.0
- apache-airflow-providers-ftp==2.0.1
- apache-airflow-providers-http==2.0.1
- apache-airflow-providers-imap==2.0.1
- apache-airflow-providers-mongo==2.1.0
- apache-airflow-providers-postgres==2.3.0
- apache-airflow-providers-sqlite==2.0.1
### Deployment
Docker-Compose
### Deployment details
Docker image build on Ubuntu 20.04 -> installed apache airflow via pip.
Localtime in image changed to Europe/Moscow.
Log format airflow.cfg option:
log_format = %%(asctime)s %%(filename)s:%%(lineno)d %%(levelname)s - %%(message)s
### What happened
For my purposes it's more useful to run DAGs when it's midnight in my timezone.
So I changed the _default_timezone_ option in airflow.cfg to "Europe/Moscow" and also changed /etc/localtime in my Docker image.
It works nicely:
- dags with _@daily_ schedule_interval runs at midnight
- python`s datetime.now() get me my localtime by default
- airflow webserver shows all time correctly when I change timezone in right top corner
... except one thing.
Python logging module saves asctime without timezone (for example "2021-10-31 18:25:42,550").
And when I open a task's log in the web interface, it shifts this time forward by three hours (for my timezone), but it's **already** in my timezone.
It is a little bit confusing :(
### What you expected to happen
I expected to see my timezone in logs :)
I see several solutions for that:
1) any possibility to turn that shift off?
2) setup logging timezone in airflow.cfg?
That problem is gone when I change system (in container) /etc/localtime to UTC.
But this is very problematic because it can affect a lot of Python tasks.
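One mitigation I could imagine (an assumption, not a documented fix: it does not change the web UI's conversion, but it makes the offset explicit in the rendered line): point `[logging] logging_config_class` at a copy of the default config with a `datefmt` that includes `%z`:
```python
# custom_log_config.py, referenced from airflow.cfg:
#   [logging]
#   logging_config_class = custom_log_config.LOGGING_CONFIG
from copy import deepcopy

from airflow.config_templates.airflow_local_settings import DEFAULT_LOGGING_CONFIG

LOGGING_CONFIG = deepcopy(DEFAULT_LOGGING_CONFIG)
# Render e.g. "2021-10-31 18:25:42 +0300" instead of "2021-10-31 18:25:42,550".
LOGGING_CONFIG["formatters"]["airflow"]["datefmt"] = "%Y-%m-%d %H:%M:%S %z"
```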
### How to reproduce
1) build docker container with different /etc/localtime
> FROM ubuntu:20.04
>
> ARG DEBIAN_FRONTEND=noninteractive
> RUN apt-get update && apt-get install -y apt-utils locales tzdata \
> && locale-gen en_US.UTF-8 \
> && ln -sf /usr/share/zoneinfo/Europe/Moscow /etc/localtime
> ENV LANG=en_US.UTF-8 LANGUAGE=en_US:en AIRFLOW_GPL_UNIDECODE=yes
>
> RUN apt-get install -y \
> python3-pip \
> && python3 -m pip install --upgrade pip setuptools wheel \
> && pip3 install --no-cache-dir \
> apache-airflow-providers-amazon \
> apache-airflow-providers-mongo \
> apache-airflow-providers-postgres \
> apache-airflow==2.2.1 \
> celery \
> ... anything else
2) run webserver / scheduler / celery worker inside
3) open web page -> trigger dag with python operator which prints something via logging
4) open done dag -> task log -> find asctime mark in log

5) switch timezone in web interface
6) watch how airflow thinks that asctime in log in UTC, but it's not

### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19342 | https://github.com/apache/airflow/pull/19401 | 35f3bf8fb447be0ed581cc317897180b541c63ce | c96789b85cf59ece65f055e158f9559bb1d18faa | "2021-10-31T16:08:56Z" | python | "2021-11-05T16:04:58Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,272 | ["airflow/providers/amazon/aws/hooks/glue_catalog.py", "tests/providers/amazon/aws/hooks/test_glue_catalog.py"] | Create get_partition, create_partition methods in AWS Glue Catalog Hook | ### Description
The current AWS Glue Catalog hook does not expose all of the available Glue API methods.
If possible, we could add `get_partition` and `create_partition` methods to the existing Glue Catalog hook.
### Use case/motivation
I work at Amazon.com. Our team uses Airflow to orchestrate data builds for our services.
As part of these data builds, we need to create new Glue partitions for generated data and also retrieve a particular Glue partition. The existing Glue Catalog hook does not contain all the available methods.
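A sketch of what the new hook methods would essentially wrap, shown with plain boto3 for illustration (the hook would reuse its own client and credentials instead):
```python
import boto3


def get_partition(database: str, table: str, values: list, region_name: str = "us-east-1") -> dict:
    """Fetch a single Glue partition by its partition values."""
    client = boto3.client("glue", region_name=region_name)
    response = client.get_partition(DatabaseName=database, TableName=table, PartitionValues=values)
    return response["Partition"]


def create_partition(database: str, table: str, partition_input: dict, region_name: str = "us-east-1") -> dict:
    """Register a new Glue partition, e.g. partition_input with "Values" and a "StorageDescriptor"."""
    client = boto3.client("glue", region_name=region_name)
    return client.create_partition(DatabaseName=database, TableName=table, PartitionInput=partition_input)
```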
### Related issues
I was not able to find any related existing issues or feature request.
### Are you willing to submit a PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19272 | https://github.com/apache/airflow/pull/23857 | 8f3a9b8542346c35712cba373baaafb518503562 | 94f2ce9342d995f1d2eb00e6a9444e57c90e4963 | "2021-10-28T03:18:42Z" | python | "2022-05-30T19:26:40Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,269 | ["airflow/providers/facebook/ads/hooks/ads.py", "airflow/providers/google/cloud/transfers/facebook_ads_to_gcs.py", "tests/providers/facebook/ads/hooks/test_ads.py", "tests/providers/google/cloud/transfers/test_facebook_ads_to_gcs.py"] | Add support for multiple AdAccount when getting Bulk Facebook Report from Facebook Ads Hook | ### Description
Currently, the Facebook Ads hook only supports one `account_id` per hook (`facebook_conn_id`). It would be great if we could pass multiple `account_id`s from the Facebook connection (`facebook_conn_id`). When multiple `account_id`s are provided in the connection, we could support getting insights for all of them from one `facebook_conn_id`, rather than creating a Facebook connection for every `account_id`.
### Use case/motivation
We are using FacebookAdsReportToGcsOperator to export our Facebook Ads data, since our infrastructure is built on Google Cloud Platform products and services. This feature would save us from creating and maintaining a connection per account id, and would let one operator (one task in a DAG) export multiple CSV files from a single connection. In addition, this feature should not prevent adding multiple Facebook connections, one per account id, for those who want to keep using it as it is.
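A sketch of the behaviour we have in mind, expressed with the facebook_business SDK that the hook wraps (the helper name and shape are assumptions, not existing hook API):
```python
from facebook_business.adobjects.adaccount import AdAccount
from facebook_business.api import FacebookAdsApi


def bulk_report_for_accounts(app_id, app_secret, access_token, account_ids, fields, params):
    """Run the same insights query against several ad accounts using one credential set."""
    api = FacebookAdsApi.init(app_id=app_id, app_secret=app_secret, access_token=access_token)
    reports = {}
    for account_id in account_ids:  # e.g. ["act_123", "act_456"]
        insights = AdAccount(account_id, api=api).get_insights(fields=fields, params=params)
        reports[account_id] = list(insights)
    return reports
```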
### Related issues
_No response_
### Are you willing to submit a PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19269 | https://github.com/apache/airflow/pull/19377 | 704ec82404dea0001e67a74596d82e138ef0b7ca | ed8b63ba2460f47744f4dcf40019592816bb89b5 | "2021-10-28T01:33:54Z" | python | "2021-12-08T09:13:26Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,260 | ["airflow/cli/commands/triggerer_command.py"] | Airflow standalone command does not exit gracefully | ### Apache Airflow version
2.2.0 (latest released)
### Operating System
macOS
### Versions of Apache Airflow Providers
_No response_
### Deployment
Virtualenv installation
### Deployment details
_No response_
### What happened
run `airflow standalone`
enter `ctrl+c`
hangs here:
```
webserver | 127.0.0.1 - - [27/Oct/2021:09:30:57 -0700] "GET /static/pin_32.png HTTP/1.1" 304 0 "http://localhost:8080/home" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/95.0.4638.54 Safari/537.36"
^Cstandalone | Shutting down components
```
### What you expected to happen
_No response_
### How to reproduce
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19260 | https://github.com/apache/airflow/pull/23274 | 6cf0176f2a676008a6fbe5b950ab2e3231fd1f76 | 6bdbed6c43df3c5473b168a75c50e0139cc13e88 | "2021-10-27T16:34:33Z" | python | "2022-04-27T16:18:19Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,252 | ["airflow/www/views.py"] | Task modal links are broken in the dag gantt view | ### Apache Airflow version
2.2.0 (latest released)
### Operating System
Debian GNU/Linux 11 (bullseye)
### Versions of Apache Airflow Providers
N/A
### Deployment
Other Docker-based deployment
### Deployment details
CeleryExecutor / ECS / Postgres
### What happened

Clicking on logs / instance details on the following dialog causes an exception:
```
Traceback (most recent call last):
File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 1952, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 1821, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/usr/local/lib/python3.9/site-packages/flask/_compat.py", line 39, in reraise
raise value
File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 1950, in full_dispatch_request
rv = self.dispatch_request()
File "/usr/local/lib/python3.9/site-packages/flask/app.py", line 1936, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/usr/local/lib/python3.9/site-packages/airflow/www/auth.py", line 51, in decorated
return func(*args, **kwargs)
File "/usr/local/lib/python3.9/site-packages/airflow/www/decorators.py", line 63, in wrapper
log.execution_date = pendulum.parse(execution_date_value, strict=False)
File "/usr/local/lib/python3.9/site-packages/pendulum/parser.py", line 29, in parse
return _parse(text, **options)
File "/usr/local/lib/python3.9/site-packages/pendulum/parser.py", line 45, in _parse
parsed = base_parse(text, **options)
File "/usr/local/lib/python3.9/site-packages/pendulum/parsing/__init__.py", line 74, in parse
return _normalize(_parse(text, **_options), **_options)
File "/usr/local/lib/python3.9/site-packages/pendulum/parsing/__init__.py", line 120, in _parse
return _parse_common(text, **options)
File "/usr/local/lib/python3.9/site-packages/pendulum/parsing/__init__.py", line 177, in _parse_common
return date(year, month, day)
ValueError: year 0 is out of range
```
This is because the execution_date in the query param of the url is empty i.e:
`http://localhost:50008/log?dag_id=test_logging&task_id=check_exception_to_sentry&execution_date=`
### What you expected to happen
The logs to load / task instance detail page to load
### How to reproduce
See above
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19252 | https://github.com/apache/airflow/pull/19258 | e0aa36ead4bb703abf5702bb1c9b105d60c15b28 | aa6c951988123edc84212d98b5a2abad9bd669f9 | "2021-10-27T13:32:24Z" | python | "2021-10-29T06:29:43Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,241 | ["airflow/macros/__init__.py", "tests/macros/test_macros.py"] | next_ds changed to proxy and it cannot be used in ds_add macro function | ### Apache Airflow version
2.2.0 (latest released)
### Operating System
Ubuntu
### Versions of Apache Airflow Providers
_No response_
### Deployment
Docker-Compose
### Deployment details
_No response_
### What happened
I tried to use this code:
`some_variable='{{macros.ds_format(macros.ds_add(next_ds, '
'(ti.start_date - ti.execution_date).days), '
'"%Y-%m-%d", "%Y-%m-%d 21:00:00")}}')`
but got this error:
`strptime() argument 1 must be str, not Proxy`
because the `next_ds` variable was changed to a Proxy object.
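A workaround I am considering (untested assumption): coerce the lazy proxy back to a plain string before it reaches `ds_add`, e.g. with Jinja's built-in `string` filter:
```python
# Same template as above, but next_ds is converted to str inside the template
# before macros.ds_add parses it (whether this fully resolves the proxy is an assumption).
some_variable = (
    "{{ macros.ds_format(macros.ds_add(next_ds | string, "
    "(ti.start_date - ti.execution_date).days), "
    '"%Y-%m-%d", "%Y-%m-%d 21:00:00") }}'
)
```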
### What you expected to happen
_No response_
### How to reproduce
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19241 | https://github.com/apache/airflow/pull/19592 | 0da54f1dbe65f55316d238308155103f820192a8 | fca2b19a5e0c081ab323479e76551d66ab478d07 | "2021-10-27T06:34:57Z" | python | "2021-11-24T23:12:17Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,223 | ["airflow/providers/mongo/sensors/mongo.py", "tests/providers/mongo/sensors/test_mongo.py"] | Add mongo_db to MongoSensor Operator | ### Description
[MongoSensor](https://airflow.apache.org/docs/apache-airflow-providers-mongo/2.1.0/_api/airflow/providers/mongo/sensors/mongo/index.html) does not expose a database parameter in the Python API. It has to use the schema from the Mongo connection.
### Use case/motivation
In Mongo, the schema is usually the auth DB and not the data DB, so the user has no option to switch DBs at the DAG level.
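Illustrative usage of the proposal; the `mongo_db` argument is the requested addition, not something the sensor accepts today:
```python
from airflow.providers.mongo.sensors.mongo import MongoSensor

wait_for_doc = MongoSensor(
    task_id="wait_for_doc",
    mongo_conn_id="mongo_default",
    collection="orders",
    query={"status": "ready"},
    mongo_db="analytics",  # proposed: query this DB instead of the connection schema
)
```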
### Related issues
_No response_
### Are you willing to submit a PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19223 | https://github.com/apache/airflow/pull/19276 | c955078b22ad797a48baf690693454c9b8ba516d | fd569e714403176770b26cf595632812bd384bc0 | "2021-10-26T11:19:08Z" | python | "2021-10-28T09:32:25Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,222 | ["airflow/models/dag.py", "tests/jobs/test_local_task_job.py"] | none_failed_min_one_success trigger rule not working with BranchPythonOperator in certain cases. | ### Apache Airflow version
2.2.0 (latest released)
### Operating System
CentOS Linux 7 (Core)
### Versions of Apache Airflow Providers
apache-airflow-providers-ftp 2.0.1
apache-airflow-providers-http 2.0.1
apache-airflow-providers-imap 2.0.1
apache-airflow-providers-sqlite 2.0.1
### Deployment
Other
### Deployment details
centos 7.3
postgres - 12.2
### What happened
### DAG 1
In this DAG, I am expecting task_6 to run even when one of task_4 or task_5 gets skipped. But as soon as task_5 skips, task_6 also gets skipped. Ideally, task_6 should have waited for task_4 to finish and only then decided whether to run or skip.
```
import datetime as dt
import time
from airflow import DAG
from airflow.operators.python_operator import BranchPythonOperator
from airflow.operators.dummy_operator import DummyOperator
dag = DAG(
dag_id="test_non_failed_min_one_success",
schedule_interval="@once",
start_date=dt.datetime(2019, 2, 28),
)
def sleep(seconds, return_val=None):
time.sleep(seconds)
return return_val
op1 = DummyOperator(task_id="task_1", dag=dag)
op2 = BranchPythonOperator(
task_id="task_2", python_callable=sleep, op_args=[30, ["task_4"]], dag=dag
)
op3 = BranchPythonOperator(task_id="task_3", python_callable=sleep, op_args=[10, []], dag=dag)
op4 = DummyOperator(task_id="task_4", dag=dag)
op5 = DummyOperator(task_id="task_5", dag=dag)
op6 = DummyOperator(task_id="task_6", dag=dag, trigger_rule="none_failed_min_one_success")
op1 >> [op2, op3]
op2 >> op4
op3 >> op5
[op4, op5] >> op6
```


### DAG 2
This is just a modification of DAG 1 where I have created two more dummy tasks between task_5 and task_6. Now, I get the desired behaviour, i.e. task_6 waits for both task_4 and dummy_2 to finish before taking the decision of whether to run or skip.
```
import datetime as dt
import time
from airflow import DAG
from airflow.operators.python_operator import BranchPythonOperator
from airflow.operators.dummy_operator import DummyOperator
dag = DAG(
dag_id="test_non_failed_min_one_success",
schedule_interval="@once",
start_date=dt.datetime(2019, 2, 28),
)
def sleep(seconds, return_val=None):
time.sleep(seconds)
return return_val
op1 = DummyOperator(task_id="task_1", dag=dag)
op2 = BranchPythonOperator(
task_id="task_2", python_callable=sleep, op_args=[30, ["task_4"]], dag=dag
)
op3 = BranchPythonOperator(task_id="task_3", python_callable=sleep, op_args=[10, []], dag=dag)
op4 = DummyOperator(task_id="task_4", dag=dag)
op5 = DummyOperator(task_id="task_5", dag=dag)
dummy1 = DummyOperator(task_id="dummy1", dag=dag)
dummy2 = DummyOperator(task_id="dummy2", dag=dag)
op6 = DummyOperator(task_id="task_6", dag=dag, trigger_rule="none_failed_min_one_success")
op1 >> [op2, op3]
op2 >> op4
op3 >> op5
op5 >> dummy1 >> dummy2
[op4, dummy2] >> op6
```


### What you expected to happen
I expected task_6 in DAG 1 to wait for both the parent tasks to finish their runs and then run. This is because I have set trigger rule = "none_failed_min_one_success" in task_6. But, this trigger rule is not being respected in DAG 1.
### How to reproduce
The above code can be used to reproduce the issue.
### Anything else
This works fine in version 2.0.2 with trigger rule = "none_failed_or_skipped".
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19222 | https://github.com/apache/airflow/pull/23181 | 2bb1cd2fec4e2069cb4bbb42d1a880e905d9468e | b4c88f8e44e61a92408ec2cb0a5490eeaf2f0dba | "2021-10-26T10:31:46Z" | python | "2022-04-26T10:53:45Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,206 | ["airflow/www/app.py"] | Webserver uses /tmp and does not respect user-configured tempdir path | ### Apache Airflow version
2.2.0 (latest released)
### Operating System
Red Hat Enterprise Linux Server 7.6 (Maipo)
### Versions of Apache Airflow Providers
_No response_
### Deployment
Virtualenv installation
### Deployment details
For this usecase - I am testing with a non-root user on a machine where the user does not have write access to /tmp.
The `TMPDIR` environment variable is set to a different directory.
### What happened
When running the webserver, errors like this are shown, and they slow down the webserver:
```
[2021-10-25 13:46:51,164] {filesystemcache.py:224} ERROR - set key '\x1b[01m__wz_cache_count\x1b[22m' -> [Errno 1] Operation not permitted: '/tmp/tmpbw3h3p93.__wz_cache' -> '/tmp/2029240f6d1128be89ddc32729463129'
```
### What you expected to happen
Airflow should respect the `TMPDIR` environment variable and use a different temp dir
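A sketch of the kind of fix expected: `tempfile.gettempdir()` already honours `TMPDIR`, so building cache/temp paths from it (Flask-Caching config keys shown for illustration) would avoid the hard-coded `/tmp`:
```python
import tempfile

# gettempdir() checks TMPDIR, TEMP and TMP before falling back to /tmp,
# so a filesystem cache configured like this follows the user's environment.
CACHE_CONFIG = {
    "CACHE_TYPE": "filesystem",
    "CACHE_DIR": tempfile.gettempdir(),
}
```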
### How to reproduce
_No response_
### Anything else
_No response_
### Are you willing to submit PR?
- [X] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19206 | https://github.com/apache/airflow/pull/19208 | 726a1517ec368e0f5906368350d6fa96836943ae | 77c92687e613c5648303acd7cebfb89fa364fc94 | "2021-10-25T17:49:10Z" | python | "2021-10-26T16:11:35Z" |
closed | apache/airflow | https://github.com/apache/airflow | 19,204 | ["UPDATING.md"] | TaskInstance.execution_date usage in query creates incorrect SQL syntax | ### Apache Airflow version
2.2.0 (latest released)
### Operating System
Red Hat Enterprise Linux Server 7.6 (Maipo)
### Versions of Apache Airflow Providers
apache-airflow-providers-postgres==2.3.0
### Deployment
Virtualenv installation
### Deployment details
_No response_
### What happened
`psycopg2.errors.SyntaxError` raised when using a custom "external task sensor" with the following code:
```python
TI = TaskInstance
DR = DagRun
if self.external_task_id:
last_instance = session.query(TI).filter(
TI.dag_id == self.external_dag_id,
TI.task_id == self.external_task_id,
TI.execution_date.in_(dttm_filter)
).order_by(TI.execution_date.desc()).first()
else:
last_instance = session.query(DR).filter(
DR.dag_id == self.dag_id,
DR.execution_date.in_(dttm_filter)
).order_by(DR.execution_date.desc()).first()
return last_instance.state
```
This code was a modified from https://github.com/apache/airflow/blob/2.2.0/airflow/sensors/external_task.py#L231 and worked on airflow 1.10.7 - 2.1.4.
Error details:
```
sqlalchemy.exc.ProgrammingError: (psycopg2.errors.SyntaxError) syntax error at or near "DESC"
LINE 7: ..._id = task_instance.run_id AND dag_run.execution_date DESC)
^
[SQL: SELECT task_instance.try_number AS task_instance_try_number, task_instance.task_id AS task_instance_task_id, task_instance.dag_id AS task_instance_dag_id, task_instance.run_id AS task_instance_run_id, task_instance.start_date AS task_instance_start_date, task_instance.end_date AS task_instance_end_date, task_instance.duration AS task_instance_duration, task_instance.state AS task_instance_state, task_instance.max_tries AS task_instance_max_tries, task_instance.hostname AS task_instance_hostname, task_instance.unixname AS task_instance_unixname, task_instance.job_id AS task_instance_job_id, task_instance.pool AS task_instance_pool, task_instance.pool_slots AS task_instance_pool_slots, task_instance.queue AS task_instance_queue, task_instance.priority_weight AS task_instance_priority_weight, task_instance.operator AS task_instance_operator, task_instance.queued_dttm AS task_instance_queued_dttm, task_instance.queued_by_job_id AS task_instance_queued_by_job_id, task_instance.pid AS task_instance_pid, task_instance.executor_config AS task_instance_executor_config, task_instance.external_executor_id AS task_instance_external_executor_id, task_instance.trigger_id AS task_instance_trigger_id, task_instance.trigger_timeout AS task_instance_trigger_timeout, task_instance.next_method AS task_instance_next_method, task_instance.next_kwargs AS task_instance_next_kwargs, dag_run_1.state AS dag_run_1_state, dag_run_1.id AS dag_run_1_id, dag_run_1.dag_id AS dag_run_1_dag_id, dag_run_1.queued_at AS dag_run_1_queued_at, dag_run_1.execution_date AS dag_run_1_execution_date, dag_run_1.start_date AS dag_run_1_start_date, dag_run_1.end_date AS dag_run_1_end_date, dag_run_1.run_id AS dag_run_1_run_id, dag_run_1.creating_job_id AS dag_run_1_creating_job_id, dag_run_1.external_trigger AS dag_run_1_external_trigger, dag_run_1.run_type AS dag_run_1_run_type, dag_run_1.conf AS dag_run_1_conf, dag_run_1.data_interval_start AS dag_run_1_data_interval_start, dag_run_1.data_interval_end AS dag_run_1_data_interval_end, dag_run_1.last_scheduling_decision AS dag_run_1_last_scheduling_decision, dag_run_1.dag_hash AS dag_run_1_dag_hash
FROM task_instance JOIN dag_run AS dag_run_1 ON dag_run_1.dag_id = task_instance.dag_id AND dag_run_1.run_id = task_instance.run_id
WHERE task_instance.dag_id = %(dag_id_1)s AND task_instance.task_id = %(task_id_1)s AND (EXISTS (SELECT 1
FROM dag_run
WHERE dag_run.dag_id = task_instance.dag_id AND dag_run.run_id = task_instance.run_id AND dag_run.execution_date IN (%(execution_date_1)s))) ORDER BY EXISTS (SELECT 1
FROM dag_run
WHERE dag_run.dag_id = task_instance.dag_id AND dag_run.run_id = task_instance.run_id AND dag_run.execution_date DESC)
LIMIT %(param_1)s]
```
### What you expected to happen
Either the query should work (best), or `TI.execution_date` should be more strictly-checked and documented to only be usable for some actions.
Although a deprecation warning is raised when accessing `TI.execution_date`, the ExternalTaskSensor uses it and it is part of the TaskInstance model.
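A workaround sketch that produces valid SQL (an assumption about intent, reusing `session`, `dttm_filter`, and `self.*` from the snippet above): since `TI.execution_date` is now an association proxy to `DagRun.execution_date`, join `DagRun` explicitly and filter/order on its column instead:
```python
from airflow.models import DagRun, TaskInstance

TI = TaskInstance
DR = DagRun

# session, dttm_filter and self.external_* come from the sensor code shown above.
last_instance = (
    session.query(TI)
    .join(DR, (DR.dag_id == TI.dag_id) & (DR.run_id == TI.run_id))
    .filter(
        TI.dag_id == self.external_dag_id,
        TI.task_id == self.external_task_id,
        DR.execution_date.in_(dttm_filter),
    )
    .order_by(DR.execution_date.desc())
    .first()
)
```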
### How to reproduce
Reproduces consistently.
See the code in "What happened"; I can provide the full sensor code if needed.
### Anything else
_No response_
### Are you willing to submit PR?
- [ ] Yes I am willing to submit a PR!
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/apache/airflow/blob/main/CODE_OF_CONDUCT.md)
| https://github.com/apache/airflow/issues/19204 | https://github.com/apache/airflow/pull/19593 | 48f228cf9ef7602df9bea6ce20d663ac0c4393e1 | 1ee65bb8ae9f98233208ebb7918cf9aa1e01823e | "2021-10-25T17:27:56Z" | python | "2021-11-15T21:41:43Z" |