status (stringclasses, 1 value) | repo_name (stringclasses, 13 values) | repo_url (stringclasses, 13 values) | issue_id (int64, 1-104k) | updated_files (stringlengths 11-1.76k) | title (stringlengths 4-369) | body (stringlengths 0-254k, nullable) | issue_url (stringlengths 38-55) | pull_url (stringlengths 38-53) | before_fix_sha (stringlengths 40) | after_fix_sha (stringlengths 40) | report_datetime (unknown) | language (stringclasses, 5 values) | commit_datetime (unknown) |
---|---|---|---|---|---|---|---|---|---|---|---|---|---|
closed | apache/airflow | https://github.com/apache/airflow | 13,750 | ["airflow/sensors/sql.py", "tests/sensors/test_sql_sensor.py"] | Support Standard SQL in BigQuery Sensor |
**Description**
A SQL sensor that uses standard SQL, because the default one uses legacy SQL.
**Use case / motivation**
Currently (correct me if I am wrong!), the SQL sensor only supports legacy SQL. If I want to poke a BigQuery table, I do not think I can do that using standard SQL right now.
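For context, a minimal sketch (purely illustrative, not an existing sensor) of what standard-SQL poking could look like, assuming the google provider's `BigQueryHook`, which already accepts a `use_legacy_sql` flag; the function and project/dataset/table names are made up:
```python
from airflow.providers.google.cloud.hooks.bigquery import BigQueryHook


def bq_table_has_rows(project: str, dataset: str, table: str) -> bool:
    # Build the hook with standard SQL enabled instead of the legacy default.
    hook = BigQueryHook(gcp_conn_id="google_cloud_default", use_legacy_sql=False)
    row = hook.get_first(f"SELECT COUNT(*) FROM `{project}.{dataset}.{table}`")
    return bool(row and row[0])
```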
**Are you willing to submit a PR?**
If community approves of this idea, sure!
| https://github.com/apache/airflow/issues/13750 | https://github.com/apache/airflow/pull/18431 | 83b51e53062dc596a630edd4bd01407a556f1aa6 | 314a4fe0050783ebb43b300c4c950667d1ddaa89 | "2021-01-18T19:35:41Z" | python | "2021-11-26T15:04:23Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,744 | ["airflow/api_connexion/endpoints/connection_endpoint.py", "tests/api_connexion/endpoints/test_connection_endpoint.py"] | REST API Connection Endpoint doesn't return the extra field in response |
**Apache Airflow version**:
Apache Airflow: 2.0.0
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release):
Distributor ID: Ubuntu
Description: Ubuntu 18.04.5 LTS
Release: 18.04
Codename: bionic
- **Kernel** (e.g. `uname -a`):
Linux Personal 5.4.0-62-generic #70~18.04.1-Ubuntu SMP Tue Jan 12 17:18:00 UTC 2021 x86_64 x86_64 x86_64 GNU/Linux
- **Install tools**:
- **Others**:
**What happened**:
REST API doesn't return the **extra** field of the connection in the response.
**What you expected to happen**:
It should return all the fields as shown in the documentation.

**How to reproduce it**:
Create one connection with id **leads_ec2** and define values as shown in the screenshot.

Now call the API endpoint below to get the connection details. As shown in the screenshot, the response does not include the extra field.
**API Endpoint** : `http://localhost:8000/api/v1/connections/leads_ec2`

**How often does this problem occur? Once? Every time etc?**:
The same happens for other connection IDs: the extra field is never returned in the response.
| https://github.com/apache/airflow/issues/13744 | https://github.com/apache/airflow/pull/13885 | 31b956c6c22476d109c45c99d8a325c5c1e0fd45 | adf7755eaa67bd924f6a4da0498bce804da1dd4b | "2021-01-18T14:42:08Z" | python | "2021-01-25T09:52:16Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,741 | ["airflow/stats.py", "tests/core/test_stats.py"] | Airflow 2.0 does not send metrics to statsD when Scheduler is run with Daemon mode |
**Apache Airflow version**:
2.0.0
**Environment**:
- **OS** (e.g. from /etc/os-release): Ubuntu 20.04 LTS
- **Python version**: 3.8
- **Kernel** (e.g. `uname -a`): x86_64 x86_64 x86_64 GNU/Linux 5.4.0-58-generic #64-Ubuntu
- **Install tools**: pip
**What happened**:
Airflow 2.0 does not send metrics to statsD.
I configured Airflow following the official documentation (https://airflow.apache.org/docs/apache-airflow/stable/logging-monitoring/metrics.html) and this article: https://dstan.medium.com/run-airflow-statsd-grafana-locally-16b372c86524 (but I used port 8125).
I turned on statsD:
```ini
statsd_on = True
statsd_host = localhost
statsd_port = 8125
statsd_prefix = airflow
```
But I do not see airflow metrics at http://localhost:9102/metrics (statsD metrics endpoint).
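A quick way to check whether any StatsD packets are emitted at all (a sketch, independent of Airflow; run it while the exporter is stopped so the port is free):
```python
import socket

# Tiny UDP listener on the configured StatsD port.
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 8125))
print("listening for StatsD packets on udp/8125 ...")
while True:
    data, _ = sock.recvfrom(4096)
    print(data.decode(errors="replace"))
```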
---
P.S. I noticed this problem only with Airflow 2.0; in version 1.10.13 everything is OK in the same environment.
Thank you in advance.
| https://github.com/apache/airflow/issues/13741 | https://github.com/apache/airflow/pull/14454 | cfa1071eaf0672dbf2b2825c3fd6affaca68bdee | 0aa597e2ffd71d3587f629c0a1cb3d904e07b6e6 | "2021-01-18T12:26:52Z" | python | "2021-02-26T14:45:56Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,713 | ["airflow/www/static/css/main.css"] | Airflow web server UI bouncing horizontally at some viewport widths | **Apache Airflow version**: 2.0.0
**Environment**: Ubuntu 20.04 LTS, Python 3.8.6 via pyenv
- **OS** (e.g. from /etc/os-release): 20.04.1 LTS (Focal Fossa)
- **Kernel** (e.g. `uname -a`): Linux DESKTOP-QBFDUA0 4.19.104-microsoft-standard #1 SMP Wed Feb 19 06:37:35 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux
- **Install tools**: Following steps in https://airflow.apache.org/docs/apache-airflow/stable/start.html
**What happened**:
I followed the quickstart here (https://airflow.apache.org/docs/apache-airflow/stable/start.html) to start Airflow on my machine. Then, I followed the tutorial here (https://airflow.apache.org/docs/apache-airflow/stable/tutorial.html) to create my own DAG after disabling the example DAGs via the config file. I actually noticed the bouncing problem I'm reporting as soon as I launched Airflow; I'm just explaining the steps I took to get to what you see in the GIF below.
When I opened the Airflow UI in my browser, it appeared to "bounce" left and right. This happened on multiple pages. It seemed to happen only at certain widths bigger than the mobile width. At a large width, it didn't happen. I captured a GIF to try to demonstrate it:

I didn't see any JS errors in the console in dev tools as this was happening.
**What you expected to happen**: A bounce-free **Airflow experience**™️
**What do you think went wrong?**: CSS? I'm not qualified for this magical front end stuff tbh.
**How to reproduce it**: Run the steps I described above on Ubuntu 20.04 LTS or a similar Linux operating system, using Python 3.
**Anything else we need to know**: n/a
**How often does this problem occur? Once? Every time etc?**
Every time I launch the Airflow web server and scheduler and load it at `localhost:8080`.
| https://github.com/apache/airflow/issues/13713 | https://github.com/apache/airflow/pull/13857 | b9eb51a0fb32cd660a5459d73d7323865b34dd99 | f72be51aeca5edb5696a9feb2acb4ff8f6bcc658 | "2021-01-16T03:43:42Z" | python | "2021-01-25T22:03:26Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,700 | ["airflow/models/dag.py", "tests/models/test_dag.py"] | Partial subset DAGs do not update task group's used_group_ids | **Apache Airflow version**: 2.0.0
**Environment**:
- **Cloud provider or hardware configuration**: Docker container
- **OS** (e.g. from /etc/os-release): Debian Stretch
**What happened**:
When working on some custom DAG override logic, I noticed that invoking `DAG.partial_subset` does not properly update the corresponding `_task_group.used_group_ids` on the returned subset DAG, such that adding back a task which was excluded during the `partial_subset` operation fails.
**What you expected to happen**:
Tasks that had already been added to the original DAG can be added again to the subset DAG returned by `DAG.partial_subset`
**How to reproduce it**:
Create any DAG with a single task called, e.g. `my-task`, then invoke `dag.partial_subset(['not-my-task'], False, False)`
Note that the returned subset DAG's `_task_group.used_group_ids` still contains `my-task` even though it was not included in the subset DAG itself
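A reproduction sketch (Airflow 2.0.0; the operator, ids, and dates are illustrative):
```python
from airflow.models import DAG
from airflow.operators.bash import BashOperator
from airflow.utils.dates import days_ago

with DAG("repro", start_date=days_ago(1), schedule_interval=None) as dag:
    BashOperator(task_id="my-task", bash_command="echo hi")

subset = dag.partial_subset(["not-my-task"], False, False)
# Expected: "my-task" is no longer reserved, since it is not part of the subset.
# Actual in 2.0.0: it is still listed, so adding a task with that id fails.
print(subset._task_group.used_group_ids)
```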
**Anything else we need to know**:
I was able to work around this by adding logic to update the new partial subset DAG's `_task_group.used_group_ids` manually, but this should really be done as part of the `DAG.partial_subset` logic | https://github.com/apache/airflow/issues/13700 | https://github.com/apache/airflow/pull/15308 | 42a1ca8aab905a0eb1ffb3da30cef9c76830abff | 1e425fe6459a39d93a9ada64278c35f7cf0eab06 | "2021-01-15T14:47:54Z" | python | "2021-04-20T18:08:52Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,697 | ["airflow/config_templates/config.yml", "airflow/config_templates/default_airflow.cfg"] | Email config section is incorrect |
**Apache Airflow version**: 2.0.0
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): n/a
**Environment**: This pertains to the docs
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release):
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
**What happened**:
I see [here](https://airflow.apache.org/docs/apache-airflow/stable/howto/email-config.html#email-configuration) it says to set `subject_template` and `html_content_template` under the email header, but in the [configuration references](https://airflow.apache.org/docs/apache-airflow/stable/configurations-ref.html#email) it doesn't show those two fields. Have they been removed for some reason?
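A quick diagnostic sketch (not taken from the reference page) to see whether the two options are picked up from `airflow.cfg` at all:
```python
from airflow.configuration import conf

for key in ("subject_template", "html_content_template"):
    # Falls back to None if the option is not set or not recognised.
    print(key, "=", conf.get("email", key, fallback=None))
```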
| https://github.com/apache/airflow/issues/13697 | https://github.com/apache/airflow/pull/13709 | 74b2cd7364df192a8b53d4734e33b07e69864acc | 1ab19b40fdea3d6399fcab4cd8855813e0d232cf | "2021-01-15T14:02:02Z" | python | "2021-01-16T01:11:35Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,680 | ["chart/files/pod-template-file.kubernetes-helm-yaml"] | "dag_id could not be found" when running airflow on KubernetesExecutor | **Apache Airflow version**: 2.0.0
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): v1.19.4
**What happened**:
I get this error when trying to execute tasks using the KubernetesExecutor:
```
[2021-01-14 19:39:17,628] {dagbag.py:440} INFO - Filling up the DagBag from /opt/airflow/dags/repo/bash.py
Traceback (most recent call last):
File "/home/airflow/.local/bin/airflow", line 8, in <module>
sys.exit(main())
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/__main__.py", line 40, in main
args.func(args)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/cli/cli_parser.py", line 48, in command
return func(*args, **kwargs)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/cli.py", line 89, in wrapper
return f(*args, **kwargs)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/cli/commands/task_command.py", line 216, in task_run
dag = get_dag(args.subdir, args.dag_id)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/cli.py", line 189, in get_dag
'parse.'.format(dag_id)
airflow.exceptions.AirflowException: dag_id could not be found: bash. Either the dag did not exist or it failed to parse.
```
**What you expected to happen**:
I expected the task to get executed and terminate.
**How to reproduce it**:
Deploy the Airflow Helm chart using this values.yaml:
```
# Licensed to the Apache Software Foundation (ASF) under one
# or more contributor license agreements. See the NOTICE file
# distributed with this work for additional information
# regarding copyright ownership. The ASF licenses this file
# to you under the Apache License, Version 2.0 (the
# "License"); you may not use this file except in compliance
# with the License. You may obtain a copy of the License at
#
# http://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing,
# software distributed under the License is distributed on an
# "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
# KIND, either express or implied. See the License for the
# specific language governing permissions and limitations
# under the License.
---
# Default values for airflow.
# This is a YAML-formatted file.
# Declare variables to be passed into your templates.
# User and group of airflow user
uid: 50000
gid: 50000
# Airflow home directory
# Used for mount paths
airflowHome: "/opt/airflow"
# Default airflow repository -- overrides all the specific images below
defaultAirflowRepository: apache/airflow
# Default airflow tag to deploy
defaultAirflowTag: 2.0.0
# Select certain nodes for airflow pods.
nodeSelector: { }
affinity: { }
tolerations: [ ]
# Add common labels to all objects and pods defined in this chart.
labels: { }
# Ingress configuration
ingress:
# Enable ingress resource
enabled: false
# Configs for the Ingress of the web Service
web:
# Annotations for the web Ingress
annotations: { }
# The path for the web Ingress
path: ""
# The hostname for the web Ingress
host: ""
# configs for web Ingress TLS
tls:
# Enable TLS termination for the web Ingress
enabled: false
# the name of a pre-created Secret containing a TLS private key and certificate
secretName: ""
# HTTP paths to add to the web Ingress before the default path
precedingPaths: [ ]
# Http paths to add to the web Ingress after the default path
succeedingPaths: [ ]
# Configs for the Ingress of the flower Service
flower:
# Annotations for the flower Ingress
annotations: { }
# The path for the flower Ingress
path: ""
# The hostname for the flower Ingress
host: ""
# configs for web Ingress TLS
tls:
# Enable TLS termination for the flower Ingress
enabled: false
# the name of a pre-created Secret containing a TLS private key and certificate
secretName: ""
# HTTP paths to add to the flower Ingress before the default path
precedingPaths: [ ]
# Http paths to add to the flower Ingress after the default path
succeedingPaths: [ ]
# Network policy configuration
networkPolicies:
# Enabled network policies
enabled: false
# Extra annotations to apply to all
# Airflow pods
airflowPodAnnotations: { }
# Enable RBAC (default on most clusters these days)
rbacEnabled: true
# Airflow executor
# Options: SequentialExecutor, LocalExecutor, CeleryExecutor, KubernetesExecutor
executor: "KubernetesExecutor"
# If this is true and using LocalExecutor/SequentialExecutor/KubernetesExecutor, the scheduler's
# service account will have access to communicate with the api-server and launch pods.
# If this is true and using the CeleryExecutor, the workers will be able to launch pods.
allowPodLaunching: true
# Images
images:
airflow:
repository: ~
tag: ~
pullPolicy: IfNotPresent
pod_template:
repository: ~
tag: ~
pullPolicy: IfNotPresent
flower:
repository: ~
tag: ~
pullPolicy: IfNotPresent
statsd:
repository: apache/airflow
tag: airflow-statsd-exporter-2020.09.05-v0.17.0
pullPolicy: IfNotPresent
redis:
repository: redis
tag: 6-buster
pullPolicy: IfNotPresent
pgbouncer:
repository: apache/airflow
tag: airflow-pgbouncer-2020.09.05-1.14.0
pullPolicy: IfNotPresent
pgbouncerExporter:
repository: apache/airflow
tag: airflow-pgbouncer-exporter-2020.09.25-0.5.0
pullPolicy: IfNotPresent
gitSync:
repository: k8s.gcr.io/git-sync
tag: v3.1.6
pullPolicy: IfNotPresent
# Environment variables for all airflow containers
env:
- name: "AIRFLOW__KUBERNETES__GIT_SYNC_RUN_AS_USER"
value: "65533"
# Secrets for all airflow containers
secret: [ ]
# - envName: ""
# secretName: ""
# secretKey: ""
# Extra secrets that will be managed by the chart
# (You can use them with extraEnv or extraEnvFrom or some of the extraVolumes values).
# The format is "key/value" where
# * key (can be templated) is the the name the secret that will be created
# * value: an object with the standard 'data' or 'stringData' key (or both).
# The value associated with those keys must be a string (can be templated)
extraSecrets: { }
# eg:
# extraSecrets:
# {{ .Release.Name }}-airflow-connections:
# data: |
# AIRFLOW_CONN_GCP: 'base64_encoded_gcp_conn_string'
# AIRFLOW_CONN_AWS: 'base64_encoded_aws_conn_string'
# stringData: |
# AIRFLOW_CONN_OTHER: 'other_conn'
# {{ .Release.Name }}-other-secret-name-suffix: |
# data: |
# ...
# Extra ConfigMaps that will be managed by the chart
# (You can use them with extraEnv or extraEnvFrom or some of the extraVolumes values).
# The format is "key/value" where
# * key (can be templated) is the the name the configmap that will be created
# * value: an object with the standard 'data' key.
# The value associated with this keys must be a string (can be templated)
extraConfigMaps: { }
# eg:
# extraConfigMaps:
# {{ .Release.Name }}-airflow-variables:
# data: |
# AIRFLOW_VAR_HELLO_MESSAGE: "Hi!"
# AIRFLOW_VAR_KUBERNETES_NAMESPACE: "{{ .Release.Namespace }}"
# Extra env 'items' that will be added to the definition of airflow containers
# a string is expected (can be templated).
extraEnv: ~
# eg:
# extraEnv: |
# - name: PLATFORM
# value: FR
# Extra envFrom 'items' that will be added to the definition of airflow containers
# A string is expected (can be templated).
extraEnvFrom: ~
# eg:
# extraEnvFrom: |
# - secretRef:
# name: '{{ .Release.Name }}-airflow-connections'
# - configMapRef:
# name: '{{ .Release.Name }}-airflow-variables'
# Airflow database config
data:
# If secret names are provided, use those secrets
metadataSecretName: ~
resultBackendSecretName: ~
# Otherwise pass connection values in
metadataConnection:
user: postgres
pass: postgres
host: ~
port: 5432
db: postgres
sslmode: disable
resultBackendConnection:
user: postgres
pass: postgres
host: ~
port: 5432
db: postgres
sslmode: disable
# Fernet key settings
fernetKey: ~
fernetKeySecretName: ~
# In order to use kerberos you need to create secret containing the keytab file
# The secret name should follow naming convention of the application where resources are
# name {{ .Release-name }}-<POSTFIX>. In case of the keytab file, the postfix is "kerberos-keytab"
# So if your release is named "my-release" the name of the secret should be "my-release-kerberos-keytab"
#
# The Keytab content should be available in the "kerberos.keytab" key of the secret.
#
# apiVersion: v1
# kind: Secret
# data:
# kerberos.keytab: <base64_encoded keytab file content>
# type: Opaque
#
#
# If you have such keytab file you can do it with similar
#
# kubectl create secret generic {{ .Release.name }}-kerberos-keytab --from-file=kerberos.keytab
#
kerberos:
enabled: false
ccacheMountPath: '/var/kerberos-ccache'
ccacheFileName: 'cache'
configPath: '/etc/krb5.conf'
keytabPath: '/etc/airflow.keytab'
principal: '[email protected]'
reinitFrequency: 3600
config: |
# This is an example config showing how you can use templating and how "example" config
# might look like. It works with the test kerberos server that we are using during integration
# testing at Apache Airflow (see `scripts/ci/docker-compose/integration-kerberos.yml` but in
# order to make it production-ready you must replace it with your own configuration that
# Matches your kerberos deployment. Administrators of your Kerberos instance should
# provide the right configuration.
[logging]
default = "FILE:{{ template "airflow_logs_no_quote" . }}/kerberos_libs.log"
kdc = "FILE:{{ template "airflow_logs_no_quote" . }}/kerberos_kdc.log"
admin_server = "FILE:{{ template "airflow_logs_no_quote" . }}/kadmind.log"
[libdefaults]
default_realm = FOO.COM
ticket_lifetime = 10h
renew_lifetime = 7d
forwardable = true
[realms]
FOO.COM = {
kdc = kdc-server.foo.com
admin_server = admin_server.foo.com
}
# Airflow Worker Config
workers:
# Number of airflow celery workers in StatefulSet
replicas: 1
# Allow KEDA autoscaling.
# Persistence.enabled must be set to false to use KEDA.
keda:
enabled: false
namespaceLabels: { }
# How often KEDA polls the airflow DB to report new scale requests to the HPA
pollingInterval: 5
# How many seconds KEDA will wait before scaling to zero.
# Note that HPA has a separate cooldown period for scale-downs
cooldownPeriod: 30
# Maximum number of workers created by keda
maxReplicaCount: 10
persistence:
# Enable persistent volumes
enabled: true
# Volume size for worker StatefulSet
size: 100Gi
# If using a custom storageClass, pass name ref to all statefulSets here
storageClassName:
# Execute init container to chown log directory.
# This is currently only needed in KinD, due to usage
# of local-path provisioner.
fixPermissions: false
kerberosSidecar:
# Enable kerberos sidecar
enabled: false
resources: { }
# limits:
# cpu: 100m
# memory: 128Mi
# requests:
# cpu: 100m
# memory: 128Mi
# Grace period for tasks to finish after SIGTERM is sent from kubernetes
terminationGracePeriodSeconds: 600
# This setting tells kubernetes that its ok to evict
# when it wants to scale a node down.
safeToEvict: true
# Annotations to add to worker kubernetes service account.
serviceAccountAnnotations: { }
# Mount additional volumes into worker.
extraVolumes: [ ]
extraVolumeMounts: [ ]
# Airflow scheduler settings
scheduler:
# Airflow 2.0 allows users to run multiple schedulers,
# However this feature is only recommended for MySQL 8+ and Postgres
replicas: 1
# Scheduler pod disruption budget
podDisruptionBudget:
enabled: false
# PDB configuration
config:
maxUnavailable: 1
resources: { }
# limits:
# cpu: 100m
# memory: 128Mi
# requests:
# cpu: 100m
# memory: 128Mi
# This setting can overwrite
# podMutation settings.
airflowLocalSettings: ~
# This setting tells kubernetes that its ok to evict
# when it wants to scale a node down.
safeToEvict: true
# Annotations to add to scheduler kubernetes service account.
serviceAccountAnnotations: { }
# Mount additional volumes into scheduler.
extraVolumes: [ ]
extraVolumeMounts: [ ]
# Airflow webserver settings
webserver:
allowPodLogReading: true
livenessProbe:
initialDelaySeconds: 15
timeoutSeconds: 30
failureThreshold: 20
periodSeconds: 5
readinessProbe:
initialDelaySeconds: 15
timeoutSeconds: 30
failureThreshold: 20
periodSeconds: 5
# Number of webservers
replicas: 1
# Additional network policies as needed
extraNetworkPolicies: [ ]
resources: { }
# limits:
# cpu: 100m
# memory: 128Mi
# requests:
# cpu: 100m
# memory: 128Mi
# Create initial user.
defaultUser:
enabled: true
role: Admin
username: admin
email: [email protected]
firstName: admin
lastName: user
password: admin
# Mount additional volumes into webserver.
extraVolumes: [ ]
# - name: airflow-ui
# emptyDir: { }
extraVolumeMounts: [ ]
# - name: airflow-ui
# mountPath: /opt/airflow
# This will be mounted into the Airflow Webserver as a custom
# webserver_config.py. You can bake a webserver_config.py in to your image
# instead
webserverConfig: ~
# webserverConfig: |
# from airflow import configuration as conf
# # The SQLAlchemy connection string.
# SQLALCHEMY_DATABASE_URI = conf.get('core', 'SQL_ALCHEMY_CONN')
# # Flask-WTF flag for CSRF
# CSRF_ENABLED = True
service:
type: NodePort
## service annotations
annotations: { }
# Annotations to add to webserver kubernetes service account.
serviceAccountAnnotations: { }
# Flower settings
flower:
# Additional network policies as needed
extraNetworkPolicies: [ ]
resources: { }
# limits:
# cpu: 100m
# memory: 128Mi
# requests:
# cpu: 100m
# memory: 128Mi
# A secret containing the connection
secretName: ~
# Else, if username and password are set, create secret from username and password
username: ~
password: ~
service:
type: ClusterIP
# Statsd settings
statsd:
enabled: true
# Additional network policies as needed
extraNetworkPolicies: [ ]
resources: { }
# limits:
# cpu: 100m
# memory: 128Mi
# requests:
# cpu: 100m
# memory: 128Mi
service:
extraAnnotations: { }
# Pgbouncer settings
pgbouncer:
# Enable pgbouncer
enabled: false
# Additional network policies as needed
extraNetworkPolicies: [ ]
# Pool sizes
metadataPoolSize: 10
resultBackendPoolSize: 5
# Maximum clients that can connect to pgbouncer (higher = more file descriptors)
maxClientConn: 100
# Pgbouner pod disruption budget
podDisruptionBudget:
enabled: false
# PDB configuration
config:
maxUnavailable: 1
# Limit the resources to pgbouncerExported.
# When you specify the resource request the scheduler uses this information to decide which node to place
# the Pod on. When you specify a resource limit for a Container, the kubelet enforces those limits so
# that the running container is not allowed to use more of that resource than the limit you set.
# See: https://kubernetes.io/docs/concepts/configuration/manage-resources-containers/
# Example:
#
# resource:
# limits:
# cpu: 100m
# memory: 128Mi
# requests:
# cpu: 100m
# memory: 128Mi
resources: { }
service:
extraAnnotations: { }
# https://www.pgbouncer.org/config.html
verbose: 0
logDisconnections: 0
logConnections: 0
sslmode: "prefer"
ciphers: "normal"
ssl:
ca: ~
cert: ~
key: ~
redis:
terminationGracePeriodSeconds: 600
persistence:
# Enable persistent volumes
enabled: true
# Volume size for worker StatefulSet
size: 1Gi
# If using a custom storageClass, pass name ref to all statefulSets here
storageClassName:
resources: { }
# limits:
# cpu: 100m
# memory: 128Mi
# requests:
# cpu: 100m
# memory: 128Mi
# If set use as redis secret
passwordSecretName: ~
brokerURLSecretName: ~
# Else, if password is set, create secret with it,
# else generate a new one on install
password: ~
# This setting tells kubernetes that its ok to evict
# when it wants to scale a node down.
safeToEvict: true
# Auth secret for a private registry
# This is used if pulling airflow images from a private registry
registry:
secretName: ~
# Example:
# connection:
# user: ~
# pass: ~
# host: ~
# email: ~
connection: { }
# Elasticsearch logging configuration
elasticsearch:
# Enable elasticsearch task logging
enabled: true
# A secret containing the connection
# secretName: ~
# Or an object representing the connection
# Example:
connection:
# user:
# pass:
host: elasticsearch-master-headless.elk.svc.cluster.local
port: 9200
# connection: {}
# All ports used by chart
ports:
flowerUI: 5555
airflowUI: 8080
workerLogs: 8793
redisDB: 6379
statsdIngest: 9125
statsdScrape: 9102
pgbouncer: 6543
pgbouncerScrape: 9127
# Define any ResourceQuotas for namespace
quotas: { }
# Define default/max/min values for pods and containers in namespace
limits: [ ]
# This runs as a CronJob to cleanup old pods.
cleanup:
enabled: false
# Run every 15 minutes
schedule: "*/15 * * * *"
# Configuration for postgresql subchart
# Not recommended for production
postgresql:
enabled: true
postgresqlPassword: postgres
postgresqlUsername: postgres
# Config settings to go into the mounted airflow.cfg
#
# Please note that these values are passed through the `tpl` function, so are
# all subject to being rendered as go templates. If you need to include a
# literal `{{` in a value, it must be expressed like this:
#
# a: '{{ "{{ not a template }}" }}'
#
# yamllint disable rule:line-length
config:
core:
dags_folder: '{{ include "airflow_dags" . }}'
load_examples: 'False'
executor: '{{ .Values.executor }}'
# For Airflow 1.10, backward compatibility
colored_console_log: 'True'
remote_logging: '{{- ternary "True" "False" .Values.elasticsearch.enabled }}'
# Authentication backend used for the experimental API
api:
auth_backend: airflow.api.auth.backend.deny_all
logging:
remote_logging: '{{- ternary "True" "False" .Values.elasticsearch.enabled }}'
colored_console_log: 'True'
logging_level: INFO
metrics:
statsd_on: '{{ ternary "True" "False" .Values.statsd.enabled }}'
statsd_port: 9125
statsd_prefix: airflow
statsd_host: '{{ printf "%s-statsd" .Release.Name }}'
webserver:
enable_proxy_fix: 'True'
expose_config: 'True'
rbac: 'True'
celery:
default_queue: celery
scheduler:
scheduler_heartbeat_sec: 5
# For Airflow 1.10, backward compatibility
statsd_on: '{{ ternary "True" "False" .Values.statsd.enabled }}'
statsd_port: 9125
statsd_prefix: airflow
statsd_host: '{{ printf "%s-statsd" .Release.Name }}'
# Restart Scheduler every 41460 seconds (11 hours 31 minutes)
# The odd time is chosen so it is not always restarting on the same "hour" boundary
run_duration: 41460
elasticsearch:
json_format: 'True'
log_id_template: "{dag_id}_{task_id}_{execution_date}_{try_number}"
elasticsearch_configs:
max_retries: 3
timeout: 30
retry_timeout: 'True'
kerberos:
keytab: '{{ .Values.kerberos.keytabPath }}'
reinit_frequency: '{{ .Values.kerberos.reinitFrequency }}'
principal: '{{ .Values.kerberos.principal }}'
ccache: '{{ .Values.kerberos.ccacheMountPath }}/{{ .Values.kerberos.ccacheFileName }}'
kubernetes:
namespace: '{{ .Release.Namespace }}'
airflow_configmap: '{{ include "airflow_config" . }}'
airflow_local_settings_configmap: '{{ include "airflow_config" . }}'
pod_template_file: '{{ include "airflow_pod_template_file" . }}/pod_template_file.yaml'
worker_container_repository: '{{ .Values.images.airflow.repository | default .Values.defaultAirflowRepository }}'
worker_container_tag: '{{ .Values.images.airflow.tag | default .Values.defaultAirflowTag }}'
delete_worker_pods: 'False'
multi_namespace_mode: '{{ if .Values.multiNamespaceMode }}True{{ else }}False{{ end }}'
# yamllint enable rule:line-length
multiNamespaceMode: false
podTemplate:
# Git sync
dags:
persistence:
# Enable persistent volume for storing dags
enabled: false
# Volume size for dags
size: 1Gi
# If using a custom storageClass, pass name here
storageClassName: gp2
# access mode of the persistent volume
accessMode: ReadWriteMany
## the name of an existing PVC to use
existingClaim: "airflow-dags"
gitSync:
enabled: true
repo: [email protected]:Tikna-inc/airflow.git
branch: main
rev: HEAD
root: "/git"
dest: "repo"
depth: 1
maxFailures: 0
subPath: ""
sshKeySecret: airflow-ssh-secret
wait: 60
containerName: git-sync
uid: 65533
```
**and this is the dag with its tasks**
```
from datetime import timedelta
import logging

import requests

from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.utils.dates import days_ago

logging.getLogger().setLevel(level=logging.INFO)

default_args = {
    'owner': 'airflow',
    'depends_on_past': False,
    'email': ['[email protected]'],
    'email_on_failure': False,
    'email_on_retry': False,
    'retries': 1,
    'retry_delay': timedelta(minutes=5),
}


def get_active_customers():
    # requests needs a scheme here, otherwise it raises MissingSchema
    requests.get("http://localhost:8080")


dag = DAG(
    'bash',
    default_args=default_args,
    description='A simple test DAG',
    schedule_interval='*/2 * * * *',
    start_date=days_ago(1),
    tags=['Test'],
    is_paused_upon_creation=False,
    catchup=False
)

t1 = BashOperator(
    task_id='print_date',
    bash_command='mkdir ./itsMe',
    dag=dag
)

t1
```
This is airflow.cfg file
```cfg
[api]
auth_backend = airflow.api.auth.backend.deny_all
[celery]
default_queue = celery
[core]
colored_console_log = True
dags_folder = /opt/airflow/dags/repo/
executor = KubernetesExecutor
load_examples = False
remote_logging = False
[elasticsearch]
json_format = True
log_id_template = {dag_id}_{task_id}_{execution_date}_{try_number}
[elasticsearch_configs]
max_retries = 3
retry_timeout = True
timeout = 30
[kerberos]
ccache = /var/kerberos-ccache/cache
keytab = /etc/airflow.keytab
principal = [email protected]
reinit_frequency = 3600
[kubernetes]
airflow_configmap = airflow-airflow-config
airflow_local_settings_configmap = airflow-airflow-config
dags_in_image = False
delete_worker_pods = False
multi_namespace_mode = False
namespace = airflow
pod_template_file = /opt/airflow/pod_templates/pod_template_file.yaml
worker_container_repository = apache/airflow
worker_container_tag = 2.0.0
[logging]
colored_console_log = True
logging_level = INFO
remote_logging = False
[metrics]
statsd_host = airflow-statsd
statsd_on = True
statsd_port = 9125
statsd_prefix = airflow
[scheduler]
run_duration = 41460
scheduler_heartbeat_sec = 5
statsd_host = airflow-statsd
statsd_on = True
statsd_port = 9125
statsd_prefix = airflow
[webserver]
enable_proxy_fix = True
expose_config = True
```
This is the pod yaml file for the new tasks
```
apiVersion: v1
kind: Pod
metadata:
annotations:
dag_id: bash2
execution_date: "2021-01-14T20:16:00+00:00"
kubernetes.io/psp: eks.privileged
task_id: create_dir
try_number: "2"
labels:
airflow-worker: "38"
airflow_version: 2.0.0
dag_id: bash2
execution_date: 2021-01-14T20_16_00_plus_00_00
kubernetes_executor: "True"
task_id: create_dir
try_number: "2"
name: sss3
namespace: airflow
spec:
containers:
- args:
- airflow
- tasks
- run
- bash2
- create_dir
- "2021-01-14T20:16:00+00:00"
- --local
- --pool
- default_pool
- --subdir
- /opt/airflow/dags/repo/bash.py
env:
- name: AIRFLOW__CORE__EXECUTOR
value: LocalExecutor
- name: AIRFLOW__CORE__FERNET_KEY
valueFrom:
secretKeyRef:
key: fernet-key
name: airflow-fernet-key
- name: AIRFLOW__CORE__SQL_ALCHEMY_CONN
valueFrom:
secretKeyRef:
key: connection
name: airflow-airflow-metadata
- name: AIRFLOW_CONN_AIRFLOW_DB
valueFrom:
secretKeyRef:
key: connection
name: airflow-airflow-metadata
- name: AIRFLOW_IS_K8S_EXECUTOR_POD
value: "True"
image: apache/airflow:2.0.0
imagePullPolicy: IfNotPresent
name: base
resources: { }
terminationMessagePath: /dev/termination-log
terminationMessagePolicy: File
volumeMounts:
- mountPath: /opt/airflow/logs
name: airflow-logs
- mountPath: /opt/airflow/airflow.cfg
name: config
readOnly: true
subPath: airflow.cfg
- mountPath: /etc/git-secret/ssh
name: git-sync-ssh-key
subPath: ssh
- mountPath: /opt/airflow/dags
name: dags
readOnly: true
- mountPath: /var/run/secrets/kubernetes.io/serviceaccount
name: airflow-worker-token-7sdtr
readOnly: true
dnsPolicy: ClusterFirst
enableServiceLinks: true
initContainers:
- env:
- name: GIT_SSH_KEY_FILE
value: /etc/git-secret/ssh
- name: GIT_SYNC_SSH
value: "true"
- name: GIT_KNOWN_HOSTS
value: "false"
- name: GIT_SYNC_REV
value: HEAD
- name: GIT_SYNC_BRANCH
value: main
- name: GIT_SYNC_REPO
value: [email protected]:Tikna-inc/airflow.git
- name: GIT_SYNC_DEPTH
value: "1"
- name: GIT_SYNC_ROOT
value: /git
- name: GIT_SYNC_DEST
value: repo
- name: GIT_SYNC_ADD_USER
value: "true"
- name: GIT_SYNC_WAIT
value: "60"
- name: GIT_SYNC_MAX_SYNC_FAILURES
value: "0"
- name: GIT_SYNC_ONE_TIME
value: "true"
image: k8s.gcr.io/git-sync:v3.1.6
imagePullPolicy: IfNotPresent
name: git-sync
resources: { }
securityContext:
runAsUser: 65533
terminationMessagePath: /dev/termination-log
terminationMessagePolicy: File
volumeMounts:
- mountPath: /git
name: dags
- mountPath: /etc/git-secret/ssh
name: git-sync-ssh-key
readOnly: true
subPath: gitSshKey
- mountPath: /var/run/secrets/kubernetes.io/serviceaccount
name: airflow-worker-token-7sdtr
readOnly: true
nodeName: ip-172-31-41-37.eu-south-1.compute.internal
priority: 0
restartPolicy: Never
schedulerName: default-scheduler
securityContext:
runAsUser: 50000
serviceAccount: airflow-worker
serviceAccountName: airflow-worker
terminationGracePeriodSeconds: 30
tolerations:
- effect: NoExecute
key: node.kubernetes.io/not-ready
operator: Exists
tolerationSeconds: 300
- effect: NoExecute
key: node.kubernetes.io/unreachable
operator: Exists
tolerationSeconds: 300
volumes:
- emptyDir: { }
name: dags
- name: git-sync-ssh-key
secret:
defaultMode: 288
secretName: airflow-ssh-secret
- emptyDir: { }
name: airflow-logs
- configMap:
defaultMode: 420
name: airflow-airflow-config
name: config
- name: airflow-worker-token-7sdtr
secret:
defaultMode: 420
secretName: airflow-worker-token-7sdtr
```
**-----------------------Important----------------------------**
**Debugging**
For debugging purposes, rather than running the task, I changed the pod args and ran it with:
```
spec:
containers:
- args:
- airflow
- webserver
```
and tried to look for the DAGs, but found none. It seems like git-sync is not working for the pods triggered by the KubernetesExecutor.
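A quick way to check this from inside a worker pod (a sketch, using the dags folder from the config above):
```python
import os

from airflow.models import DagBag

# Verify that git-sync populated the folder and that the file parses.
dags_folder = "/opt/airflow/dags/repo"
print(os.listdir(dags_folder) if os.path.isdir(dags_folder) else "folder missing")

bag = DagBag(dag_folder=dags_folder, include_examples=False)
print("dag_ids:", list(bag.dags))
print("import errors:", bag.import_errors)
```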
Any help please ??? | https://github.com/apache/airflow/issues/13680 | https://github.com/apache/airflow/pull/13826 | 3909232fafd09ac72b49010ecdfd6ea48f06d5cf | 5f74219e6d400c4eae9134f6015c72430d6d549f | "2021-01-14T19:47:20Z" | python | "2021-02-04T19:01:46Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,676 | ["airflow/api_connexion/endpoints/xcom_endpoint.py", "airflow/api_connexion/openapi/v1.yaml", "tests/api_connexion/endpoints/test_xcom_endpoint.py"] | API Endpoints - /xcomEntries/{xcom_key} - doesn't return value | **Apache Airflow version**: 2.0.0
**What happened**:
Using the endpoint `/dags/{dag_id}/dagRuns/{dag_run_id}/taskInstances/{task_id}/xcomEntries/{xcom_key}`, I get a response body without the `value` entry, like:
```
{
"dag_id": "string",
"execution_date": "string",
"key": "string",
"task_id": "string",
"timestamp": "string"
}
```
Instead of:
```
{
"dag_id": "string",
"execution_date": "string",
"key": "string",
"task_id": "string",
"timestamp": "string",
"value": "string"
}
```
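A request sketch that reproduces the truncated payload above (the ids and credentials are hypothetical):
```python
import requests

url = (
    "http://localhost:8080/api/v1/dags/my_dag/dagRuns/my_run"
    "/taskInstances/my_task/xcomEntries/my_key"
)
entry = requests.get(url, auth=("admin", "admin")).json()
print(sorted(entry))  # "value" is missing from the returned keys
```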
An XCom value for the given `key` does exist. | https://github.com/apache/airflow/issues/13676 | https://github.com/apache/airflow/pull/13684 | 2fef2ab1bf0f8c727a503940c9c65fd5be208386 | dc80fa4cbc070fc6e84fcc95799d185badebaa71 | "2021-01-14T15:57:46Z" | python | "2021-01-15T10:18:44Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,659 | ["docs/apache-airflow/howto/define_extra_link.rst"] | Operator Extra Links not showing up on UI |
**Apache Airflow version**: 2.0
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): 1.18
**Environment**:
- **Cloud provider or hardware configuration**: AWS
- **OS** (e.g. from /etc/os-release): Linux
- **Kernel** (e.g. `uname -a`): Linux
- **Install tools**:
- **Others**:
**What happened**:
I followed the example here: https://airflow.apache.org/docs/apache-airflow/stable/howto/define_extra_link.html#define-an-operator-extra-link, and was expecting the link to show up in the UI, but it does not :(

```
from airflow.models.baseoperator import BaseOperator, BaseOperatorLink
from airflow.utils.decorators import apply_defaults


class GoogleLink(BaseOperatorLink):
    name = "Google"

    def get_link(self, operator, dttm):
        return "https://www.google.com"


class MyFirstOperator(BaseOperator):
    operator_extra_links = (
        GoogleLink(),
    )

    @apply_defaults
    def __init__(self, **kwargs):
        super().__init__(**kwargs)

    def execute(self, context):
        self.log.info("Hello World!")
        print(self.extra_links)
```
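Possibly relevant (a sketch based on the plugins documentation, not a confirmed fix): with DAG serialization the webserver may only render extra links that have been registered through a plugin or provider, e.g.:
```python
from airflow.plugins_manager import AirflowPlugin


class MyExtraLinkPlugin(AirflowPlugin):
    # Hypothetical plugin; GoogleLink is the class defined in the snippet above.
    name = "my_extra_link_plugin"
    operator_extra_links = [GoogleLink()]
```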
**What you expected to happen**: I expected a button link to show up in the Task Instance modal
**How to reproduce it**: Follow Example here on Airflow 2.0 https://airflow.apache.org/docs/apache-airflow/stable/howto/define_extra_link.html#define-an-operator-extra-link
**Anything else we need to know**:
| https://github.com/apache/airflow/issues/13659 | https://github.com/apache/airflow/pull/13683 | 3558538883612a10e9ea3521bf864515b6e560c5 | 3d21082adc3bde63a15dad4db85b448ff695cfc6 | "2021-01-13T20:55:43Z" | python | "2021-01-15T12:21:53Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,656 | ["airflow/www/static/js/connection_form.js"] | Password is unintendedly changed when editing a connection | **Apache Airflow version**: 2.0.0
**What happened**:
When editing a connection - without changing the password - and saving the edited connection, a wrong password is saved.
**What you expected to happen**:
If I do not change the password in the UI, I expect that the password is not changed.
**How to reproduce it**:
- Create a new connection and save it (screenshots 1 + 2)
- Edit the connection without editing the password, and save it again (screenshots 3 + 4)
If you _do_ edit the password, the (new or old) password is saved correctly.
*Screenshot 1*

*Screenshot 2*

*Screenshot 3*

*Screenshot 4*

(I blurred out the full string in the unlikely case that the full string might contain information on my fernet key or something) | https://github.com/apache/airflow/issues/13656 | https://github.com/apache/airflow/pull/15073 | 1627323a197bba2c4fbd71816a9a6bd3f78c1657 | b4374d33b0e5d62c3510f1f5ac4a48e7f48cb203 | "2021-01-13T16:34:22Z" | python | "2021-03-29T19:12:15Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,653 | ["airflow/api_connexion/openapi/v1.yaml", "airflow/api_connexion/schemas/dag_schema.py", "tests/api_connexion/endpoints/test_dag_endpoint.py", "tests/api_connexion/schemas/test_dag_schema.py"] | API Endpoint for Airflow V1 - DAGs details | **Description**
We need an endpoint in the Airflow V1 API to retrieve the details of an existing DAG, e.g. `GET /dags/{dag_id}/details`
**Use case / motivation**
We want to be able to retrieve/discover the parameters that a DAG accepts. We can see that you pass parameters when you execute a dag via the conf object. We can also see that you explicitly declare parameters that a DAG accepts via the params argument when creating the DAG.
However, we can't find anything in either the REST API or the CLI that allows you to retrieve this information from a DAG (note that we mean the DAG itself, not a DAG run).
It doesn't even look like the version 2 API documents this: the OpenAPI spec mentions a dags/{dag_id}/details endpoint, but it is not documented. We found the related GitHub issue for this new endpoint; it is done, but it looks like the documentation hasn't been updated yet.
Please can you:
1. Provide the response for the v2 details endpoint
2. Advise when v2 documentation will be updated with the details endpoint.
3. Advise if there is a workaround for us doing this on v1.1
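For reference, a sketch of the kind of call we have in mind against the v2 endpoint mentioned above (the DAG id and credentials are hypothetical):
```python
import requests

resp = requests.get(
    "http://localhost:8080/api/v1/dags/my_dag/details",
    auth=("admin", "admin"),
)
print(resp.json())  # hoping to discover the declared params in here
```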
**Related Issues**
#8138
| https://github.com/apache/airflow/issues/13653 | https://github.com/apache/airflow/pull/13790 | 2c6c7fdb2308de98e142618836bdf414df9768c8 | 10b8ecc86f24739a38e56347dcc8dc60e3e43975 | "2021-01-13T14:21:27Z" | python | "2021-01-21T15:42:19Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,638 | ["airflow/utils/log/file_task_handler.py", "tests/utils/test_log_handlers.py"] | Stable API task logs |
**Apache Airflow version**: 2.0.0
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): NA
**Environment**:
- **Cloud provider or hardware configuration**: PC (docker-compose)
- **OS** (e.g. from /etc/os-release): Linux mint 20 (for PC), Debian Buster in container
- **Kernel** (e.g. `uname -a`): Linux 607a1bfeebd2 5.4.0-60-generic #67-Ubuntu SMP Tue Jan 5 18:31:36 UTC 2021 x86_64 GNU/Linux
- **Install tools**: Poetry (so pipy)
- **Others**:
Using python 3.8.6, with Celery Executor, one worker
Task did run properly
**What happened**:
I tried to get the logs of a task instance using the stable Rest API through the Swagger UI included in Airflow, and it crashed (got a stack trace)
I got a 500 error:
```
engine-webserver_1 | 2021-01-12T16:45:18.465370280Z [2021-01-12 16:45:18,464] {app.py:1891} ERROR - Exception on /api/v1/dags/insert/dagRuns/manual__2021-01-12T15:05:59.560500+00:00/taskInstances/insert-db/logs/0 [GET]
engine-webserver_1 | 2021-01-12T16:45:18.465391147Z Traceback (most recent call last):
engine-webserver_1 | 2021-01-12T16:45:18.465394643Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/flask/app.py", line 2447, in wsgi_app
engine-webserver_1 | 2021-01-12T16:45:18.465397709Z response = self.full_dispatch_request()
engine-webserver_1 | 2021-01-12T16:45:18.465400161Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/flask/app.py", line 1952, in full_dispatch_request
engine-webserver_1 | 2021-01-12T16:45:18.465402912Z rv = self.handle_user_exception(e)
engine-webserver_1 | 2021-01-12T16:45:18.465405405Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/flask/app.py", line 1821, in handle_user_exception
engine-webserver_1 | 2021-01-12T16:45:18.465407715Z reraise(exc_type, exc_value, tb)
engine-webserver_1 | 2021-01-12T16:45:18.465409739Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/flask/_compat.py", line 39, in reraise
engine-webserver_1 | 2021-01-12T16:45:18.465412258Z raise value
engine-webserver_1 | 2021-01-12T16:45:18.465414560Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/flask/app.py", line 1950, in full_dispatch_request
engine-webserver_1 | 2021-01-12T16:45:18.465425555Z rv = self.dispatch_request()
engine-webserver_1 | 2021-01-12T16:45:18.465427999Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/flask/app.py", line 1936, in dispatch_request
engine-webserver_1 | 2021-01-12T16:45:18.465429697Z return self.view_functions[rule.endpoint](**req.view_args)
engine-webserver_1 | 2021-01-12T16:45:18.465431146Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/connexion/decorators/decorator.py", line 48, in wrapper
engine-webserver_1 | 2021-01-12T16:45:18.465433001Z response = function(request)
engine-webserver_1 | 2021-01-12T16:45:18.465434308Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/connexion/decorators/uri_parsing.py", line 144, in wrapper
engine-webserver_1 | 2021-01-12T16:45:18.465435841Z response = function(request)
engine-webserver_1 | 2021-01-12T16:45:18.465437122Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/connexion/decorators/validation.py", line 384, in wrapper
engine-webserver_1 | 2021-01-12T16:45:18.465438620Z return function(request)
engine-webserver_1 | 2021-01-12T16:45:18.465440074Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/connexion/decorators/response.py", line 103, in wrapper
engine-webserver_1 | 2021-01-12T16:45:18.465441667Z response = function(request)
engine-webserver_1 | 2021-01-12T16:45:18.465443086Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/connexion/decorators/parameter.py", line 121, in wrapper
engine-webserver_1 | 2021-01-12T16:45:18.465445345Z return function(**kwargs)
engine-webserver_1 | 2021-01-12T16:45:18.465446713Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/airflow/api_connexion/security.py", line 47, in decorated
engine-webserver_1 | 2021-01-12T16:45:18.465448202Z return func(*args, **kwargs)
engine-webserver_1 | 2021-01-12T16:45:18.465449538Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/airflow/utils/session.py", line 65, in wrapper
engine-webserver_1 | 2021-01-12T16:45:18.465451032Z return func(*args, session=session, **kwargs)
engine-webserver_1 | 2021-01-12T16:45:18.465452504Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/airflow/api_connexion/endpoints/log_endpoint.py", line 81, in get_log
engine-webserver_1 | 2021-01-12T16:45:18.465454135Z logs, metadata = task_log_reader.read_log_chunks(ti, task_try_number, metadata)
engine-webserver_1 | 2021-01-12T16:45:18.465455658Z File "/brain/engine/.cache/poetry/meta-vSi4r4R8-py3.8/lib/python3.8/site-packages/airflow/utils/log/log_reader.py", line 58, in read_log_chunks
engine-webserver_1 | 2021-01-12T16:45:18.465457226Z logs, metadatas = self.log_handler.read(ti, try_number, metadata=metadata)
engine-webserver_1 | 2021-01-12T16:45:18.465458632Z ValueError: not enough values to unpack (expected 2, got 1)
```
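For reference, a request sketch of the call that produced the 500 above (the ids come from the traceback; credentials are hypothetical):
```python
import requests

url = (
    "http://localhost:8080/api/v1/dags/insert/dagRuns/"
    "manual__2021-01-12T15:05:59.560500+00:00/taskInstances/insert-db/logs/{n}"
)
for n in (0, 1):
    resp = requests.get(url.format(n=n), auth=("admin", "admin"))
    print(n, resp.status_code)
```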
**What you expected to happen**:
I expected to get the logs of my task
**How to reproduce it**:
I think it happens every time (at least on my side).
**Anything else we need to know**:
Other stable API calls, such as getting the list of DAG runs or task instances, worked well.
Logs is appearing well if I go to
EDIT : Ok, I'm stupid, I put 0 as try number, instead of 1...
So it's not a big bug, though I think a try number of 0 should return a 400 response rather than a 500 crash.
Should I keep it open ? | https://github.com/apache/airflow/issues/13638 | https://github.com/apache/airflow/pull/14001 | 32d2c25e2dd1fd069f51bdfdd79595f12047a867 | 2366f861ee97f50e2cff83d557a1ae97030febf9 | "2021-01-12T17:10:25Z" | python | "2021-02-01T13:33:30Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,637 | ["UPDATING.md", "airflow/config_templates/config.yml", "airflow/config_templates/default_airflow.cfg"] | Scheduler takes 100% of CPU without task execution | Hi,
Running Airflow 2.0.0 with Python 3.6.9, the scheduler consumes a lot of CPU time without executing any task:
PID USER PR NI VIRT RES SHR S %CPU %MEM TIME+ COMMAND
15758 oli 20 0 42252 3660 3124 R 100.0 0.0 0:00.06 top
16764 oli 20 0 590272 90648 15468 R 200.0 0.3 0:00.59 airflow schedul
16769 oli 20 0 588808 77236 13900 R 200.0 0.3 0:00.55 airflow schedul
1 root 20 0 1088 548 516 S 0.0 0.0 0:13.28 init
10 root 20 0 900 80 16 S 0.0 0.0 0:00.00 init | https://github.com/apache/airflow/issues/13637 | https://github.com/apache/airflow/pull/13664 | 9536ad906f1591a5a0f82f69ba3bd214c4516c5b | e4b8ee63b04a25feb21a5766b1cc997aca9951a9 | "2021-01-12T14:16:04Z" | python | "2021-01-14T13:08:12Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,624 | ["airflow/www/templates/airflow/dags.html"] | Misleading dag pause info tooltip |
**Apache Airflow version**:
2.0.0
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
N/A
**Environment**:
N/A
**What happened**:
The UI tooltip is misleading and confuses the user.
Tooltip says " use this toggle to pause the dag" which implies that if the toggle is set to **ON** the flow is paused, but in fact it's the reverse of that.
Either the logic should be reversed so that if the toggle is on, the DAG is paused, or the wording should be changed to explicitly state the actual functionality of the "on state" of the toggle.
something like "When this toggle is ON, the DAG will be executed at scheduled times, turn this toggle off to pause executions of this dag ".
**What you expected to happen**:
UI tooltip should be honest and clear about its function.
**How to reproduce it**:
open DAGs window of the airflow webserver in a supported browser, hold mouse over the (i) on the second cell from left on the top row.
<img width="534" alt="Screen Shot 2021-01-11 at 12 27 18 PM" src="https://user-images.githubusercontent.com/14813957/104258476-7bfad200-5434-11eb-8152-443f05071e4b.png">
| https://github.com/apache/airflow/issues/13624 | https://github.com/apache/airflow/pull/13642 | 3d538636984302013969aa82a04d458d24866403 | c4112e2e9deaa2e30e6fd05d43221023d0d7d40b | "2021-01-12T01:46:46Z" | python | "2021-01-12T19:14:31Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,602 | ["airflow/www/utils.py", "tests/www/test_utils.py"] | WebUI returns an error when logs that do not use a DAG list `None` as the DAG ID |
**Apache Airflow version**: 2.0.0
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): N/A
**Environment**:
- **Cloud provider or hardware configuration**: docker-compose
- **OS** (e.g. from /etc/os-release): Docker `apache/airflow` `sha256:b4f957bef5a54ca0d781ae1431d8485f125f0b5d18f3bc7e0416c46e617db265`
- **Kernel** (e.g. `uname -a`): Linux c697ae3a0397 5.4.0-58-generic #64~18.04.1-Ubuntu SMP Wed Dec 9 17:11:11 UTC 2020 x86_64 GNU/Linux
- **Install tools**: docker
- **Others**:
**What happened**:
When an event that does not include a DAG is logged in the UI, this event lists the DAG ID as "None". This "None" is treated as an actual DAG ID with a link, which throws an error if clicked.
```
Something bad has happened.
Please consider letting us know by creating a bug report using GitHub.
Python version: 3.6.12
Airflow version: 2.0.0
Node: 9097c882a712
-------------------------------------------------------------------------------
Traceback (most recent call last):
File "/home/airflow/.local/lib/python3.6/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/home/airflow/.local/lib/python3.6/site-packages/flask/app.py", line 1952, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/home/airflow/.local/lib/python3.6/site-packages/flask/app.py", line 1821, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/home/airflow/.local/lib/python3.6/site-packages/flask/_compat.py", line 39, in reraise
raise value
File "/home/airflow/.local/lib/python3.6/site-packages/flask/app.py", line 1950, in full_dispatch_request
rv = self.dispatch_request()
File "/home/airflow/.local/lib/python3.6/site-packages/flask/app.py", line 1936, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/www/auth.py", line 34, in decorated
return func(*args, **kwargs)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/www/decorators.py", line 97, in view_func
return f(*args, **kwargs)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/www/decorators.py", line 60, in wrapper
return f(*args, **kwargs)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/session.py", line 65, in wrapper
return func(*args, session=session, **kwargs)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/www/views.py", line 2028, in graph
dag = current_app.dag_bag.get_dag(dag_id)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/utils/session.py", line 65, in wrapper
return func(*args, session=session, **kwargs)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/dagbag.py", line 171, in get_dag
self._add_dag_from_db(dag_id=dag_id, session=session)
File "/home/airflow/.local/lib/python3.6/site-packages/airflow/models/dagbag.py", line 227, in _add_dag_from_db
raise SerializedDagNotFound(f"DAG '{dag_id}' not found in serialized_dag table")
airflow.exceptions.SerializedDagNotFound: DAG 'None' not found in serialized_dag table
```
**What you expected to happen**:
I expected `None` to not be a link or have it link to some sort of error page. Instead it throws an error.
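One way the web UI's link renderer could guard against this, as a sketch only (the helper shape and the `Airflow.graph` endpoint name are my assumptions, not the actual upstream fix):
```python
from flask import url_for
from markupsafe import Markup


def dag_link(attr):
    """Render a DAG link, falling back to plain text when there is no dag_id."""
    dag_id = attr.get('dag_id')
    if not dag_id:
        # CLI-originated log rows have no DAG; render text instead of a dead link.
        return Markup('None')
    url = url_for('Airflow.graph', dag_id=dag_id)
    return Markup('<a href="{url}">{dag_id}</a>').format(url=url, dag_id=dag_id)
```
Any equivalent null-check before building the `<a>` tag would avoid the `SerializedDagNotFound` traceback above.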
**How to reproduce it**:
Run a CLI command such as `airflow dags list`, then go to `/log/list/` in the web UI, and click on the `None` *Dag Id* for the logged event for the command.

**Anything else we need to know**:
This problem appears to occur every time. | https://github.com/apache/airflow/issues/13602 | https://github.com/apache/airflow/pull/13619 | eb40eea81be95ecd0e71807145797b6d82375885 | 8ecdef3e50d3b83901d70a13794ae6afabc4964e | "2021-01-11T01:26:42Z" | python | "2021-01-12T10:16:01Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,597 | ["airflow/www/static/js/connection_form.js", "airflow/www/views.py"] | Extra field widgets of custom connections do not properly save data | **Apache Airflow version**: 2.0.0
**Environment**: Docker image `apache/airflow:2.0.0-python3.8` on Win10 with WSL
**What happened**:
I built a custom provider with a number of custom connections.
This works:
- The connections are properly registered
- The UI does not show hidden fields as per `get_ui_field_behaviour`
- The UI correctly relabels fields as per `get_ui_field_behaviour`
- The UI correctly shows added widgets as per `get_connection_form_widgets` (well, mostly)
What does not work:
- The UI does not save values entered for additional widgets
I used the [JDBC example](https://github.com/apache/airflow/blob/master/airflow/providers/jdbc/hooks/jdbc.py) to string myself along by copying it and pasting it as a hook into my custom provider package. (I did not install the JDBC provider package, unless it is installed in the image I use - but if I don't add it in my own provider package, I don't have the connection type in the UI, so I assume it is not). Curiously, The JDBC hook works just fine. I then created the following file:
```Python
"""
You find two child classes of DbApiHook in here. One is the exact copy of the JDBC
provider hook, minus some irrelevant logic (I only care about the UI stuff here).
The other is the exact same thing, except I added an "x" behind every occurrence
of "jdbc" in strings and names.
"""
from typing import Any, Dict, Optional
from airflow.hooks.dbapi import DbApiHook
class JdbcXHook(DbApiHook):
"""
Copy of JdbcHook below. Added an "x" at various places, including the class name.
"""
conn_name_attr = 'jdbcx_conn_id' # added x
default_conn_name = 'jdbcx_default' # added x
conn_type = 'jdbcx' # added x
hook_name = 'JDBCx Connection' # added x
supports_autocommit = True
@staticmethod
def get_connection_form_widgets() -> Dict[str, Any]:
"""Returns connection widgets to add to connection form"""
from flask_appbuilder.fieldwidgets import BS3TextFieldWidget
from flask_babel import lazy_gettext
from wtforms import StringField
# added an x in the keys
return {
"extra__jdbcx__drv_path": StringField(lazy_gettext('Driver Path'), widget=BS3TextFieldWidget()),
"extra__jdbcx__drv_clsname": StringField(
lazy_gettext('Driver Class'), widget=BS3TextFieldWidget()
),
}
@staticmethod
def get_ui_field_behaviour() -> Dict:
"""Returns custom field behaviour"""
return {
"hidden_fields": ['port', 'schema', 'extra'],
"relabeling": {'host': 'Connection URL'},
}
class JdbcHook(DbApiHook):
"""
General hook for jdbc db access.
JDBC URL, username and password will be taken from the predefined connection.
Note that the whole JDBC URL must be specified in the "host" field in the DB.
Raises an airflow error if the given connection id doesn't exist.
"""
conn_name_attr = 'jdbc_conn_id'
default_conn_name = 'jdbc_default'
conn_type = 'jdbc'
hook_name = 'JDBC Connection plain'
supports_autocommit = True
@staticmethod
def get_connection_form_widgets() -> Dict[str, Any]:
"""Returns connection widgets to add to connection form"""
from flask_appbuilder.fieldwidgets import BS3TextFieldWidget
from flask_babel import lazy_gettext
from wtforms import StringField
return {
"extra__jdbc__drv_path": StringField(lazy_gettext('Driver Path'), widget=BS3TextFieldWidget()),
"extra__jdbc__drv_clsname": StringField(
lazy_gettext('Driver Class'), widget=BS3TextFieldWidget()
),
}
@staticmethod
def get_ui_field_behaviour() -> Dict:
"""Returns custom field behaviour"""
return {
"hidden_fields": ['port', 'schema', 'extra'],
"relabeling": {'host': 'Connection URL'},
}
```
**What you expected to happen**:
After doing the above, I expected
- Seeing both in the add connection UI
- Being able to use both the same way
**What actually happens**:
- I _do_ see both in the UI (Screenshot 1)
- For some reason, the "normal" hook shows BOTH sets of extra fields - not just its own two (Screenshot 2)
- If I add the connection as in Screenshot 2, the values are saved in all four fields (its own two + the two for the "x" hook) properly, as shown in Screenshot 3
- If I open that connection for editing again, all four fields are also there with the correct values in the UI
- If I add the connection for the "x" type as in Screenshot 4, it ostensibly saves it - with the two fields defined in the code
- You can see in Screenshot 5 that the extra is saved as an empty string?!
- When trying to edit the connection in the UI, you also see that no data was saved for the two extra widgets?!
- I added a few more screenshots of airflow providers CLI command results (note that the package `ewah` has a number of other custom hooks, and the issue above occurs for *all* of them)
*Screenshot 1:*

*Screenshot 2:*

*Screenshot 3:*

*Screenshot 4:*

*Screenshot 5:*

*Screenshot 6 - airflow providers behaviours:*

*Screenshot 7 - airflow providers get:*

(Note: This error occurs with pre-installed providers as well)
*Screenshot 8 - airflow providers hooks:*

*Screenshot 9 - airflow providers list:*

*Screenshot 10 - airflow providers widgets:*

**How to reproduce it**:
- create a custom provider package
- add the code snippet pasted above somewhere
- add the two classes to the `hook-class-names` list in the provider info (see the sketch after this list)
- install the provider package
- do what I described above
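For reference on the `hook-class-names` step, a minimal provider-info sketch (module paths, package name and version are illustrative; the provider package would expose this function through the `apache_airflow_provider` entry point):
```python
# my_provider/__init__.py -- illustrative module
def get_provider_info():
    return {
        "package-name": "my-provider",  # illustrative
        "name": "My Provider",
        "description": "Custom hooks with extra connection widgets.",
        "hook-class-names": [
            "my_provider.hooks.jdbc_copy.JdbcHook",
            "my_provider.hooks.jdbc_copy.JdbcXHook",
        ],
        "versions": ["0.0.1"],
    }
```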
| https://github.com/apache/airflow/issues/13597 | https://github.com/apache/airflow/pull/13640 | 34eb203c5177bc9be91a9387d6a037f6fec9dba1 | b007fc33d481f0f1341d1e1e4cba719a5fe6580d | "2021-01-10T12:00:44Z" | python | "2021-01-12T23:32:49Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,559 | ["airflow/models/taskinstance.py"] | Nested templated variables do not always render | **Apache Airflow version**:
1.10.14 and 1.10.8.
**Environment**:
Python 3.6 and Airflow 1.10.14 on SQLite
**What happened**:
Nested jinja templates do not consistently render when running tasks. TI run rendering behavior also differs from airflow UI and airflow render cli.
**What you expected to happen**:
Airflow should render nested jinja templates consistently and completely across each interface. Coming from airflow 1.8.2, this used to be the case.
This regression may have been introduced in 1.10.6 with a refactor of BaseOperator templating functionality.
https://github.com/apache/airflow/pull/5461
Whether or not a nested layer renders seems to differ based on which arg is being templated in an operator and perhaps order. Furthermore, it seems like the render cli and airflow ui each apply TI.render_templates() a second time, creating inconsistency in what nested templates get rendered.
There may be a bug in the way BaseOperator.render_template() observes/caches templated fields.
**How to reproduce it**:
From the most basic airflow setup
nested_template_bug.py
```
from datetime import datetime
from airflow import DAG
from airflow.operators.python_operator import PythonOperator
with DAG("nested_template_bug", start_date=datetime(2021, 1, 1)) as dag:
arg0 = 'level_0_{{task.task_id}}_{{ds}}'
kwarg1 = 'level_1_{{task.op_args[0]}}'
def print_fields(arg0, kwarg1):
print(f'level 0 arg0: {arg0}')
print(f'level 1 kwarg1: {kwarg1}')
nested_render = PythonOperator(
task_id='nested_render',
python_callable=print_fields,
op_args=[arg0, ],
op_kwargs={
'kwarg1': kwarg1,
},
)
```
```
> airflow test c
level 0 arg0: level_0_nested_render_2021-01-01
level 1 kwarg1: level_1_level_0_{{task.task_id}}_{{ds}}
> airflow render nested_template_bug nested_render 2021-01-01
# ----------------------------------------------------------
# property: op_args
# ----------------------------------------------------------
['level_0_nested_render_2021-01-01']
# ----------------------------------------------------------
# property: op_kwargs
# ----------------------------------------------------------
{'kwarg1': 'level_1_level_0_nested_render_2021-01-01'}
``` | https://github.com/apache/airflow/issues/13559 | https://github.com/apache/airflow/pull/18516 | b0a29776b32cbee657c9a6369d15278a999e927f | 1ac63cd5e2533ce1df1ec1170418a09170998699 | "2021-01-08T04:06:45Z" | python | "2021-09-28T15:30:58Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,535 | ["airflow/providers/docker/CHANGELOG.rst", "airflow/providers/docker/operators/docker.py", "tests/providers/docker/operators/test_docker.py"] | DockerOperator / XCOM : `TypeError: Object of type bytes is not JSON serializable` | **Apache Airflow version**: 2.0.0
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): NA
**Environment**:
* local Ubuntu 18.04 LTS /
* docker-compose version 1.25.3, build d4d1b42b
* docker 20.10.1, build 831ebea
**What happened**:
When enabling XCom push for a DockerOperator task, the following error is thrown after the task finishes successfully:
`TypeError: Object of type bytes is not JSON serializable`
**What you expected to happen**:
* error is not thrown
* if xcom_all is True: xcom contains all log lines
* if xcom_all is False: xcom contains last log line
**How to reproduce it**:
see docker compose and readme here:
https://github.com/AlessioM/airflow-xcom-issue
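For convenience, a minimal DAG sketch that should hit the same code path (image, command and dag_id are illustrative; `xcom_all` as exposed by the docker provider):
```python
from airflow import DAG
from airflow.providers.docker.operators.docker import DockerOperator
from airflow.utils.dates import days_ago

with DAG("docker_xcom_repro", start_date=days_ago(1), schedule_interval=None) as dag:
    DockerOperator(
        task_id="say_hello",
        image="alpine:3.12",      # illustrative image
        command="echo hello",
        do_xcom_push=True,        # the pushed log line arrives as bytes
        xcom_all=False,           # only the last log line
        auto_remove=True,
    )
```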
| https://github.com/apache/airflow/issues/13535 | https://github.com/apache/airflow/pull/13536 | 2de7793881da0968dd357a54e8b2a99017891915 | cd3307ff2147b170dc3feb5999edf5c8eebed4ba | "2021-01-07T09:22:20Z" | python | "2021-07-26T17:55:07Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,532 | ["airflow/providers/docker/operators/docker.py", "tests/providers/docker/operators/test_docker.py"] | In DockerOperator the parameter auto_remove doesn't work in | When setting DockerOperator with auto_remove=True in airflow version 2.0.0 the container remain in the container list if it was finished with 'Exited (1)' | https://github.com/apache/airflow/issues/13532 | https://github.com/apache/airflow/pull/13993 | 8eddc8b5019890a712810b8e5b1185997adb9bf4 | ba54afe58b7cbd3711aca23252027fbd034cca41 | "2021-01-07T07:48:37Z" | python | "2021-01-31T19:23:45Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,531 | ["airflow/api_connexion/endpoints/task_instance_endpoint.py", "airflow/api_connexion/openapi/v1.yaml", "airflow/api_connexion/schemas/task_instance_schema.py", "tests/api_connexion/endpoints/test_task_instance_endpoint.py"] | Airflow v1 REST List task instances api can not get `no_status` task instance |
**Apache Airflow version**:
2.0
**Environment**:
- **OS** ubuntu 18.04
- **Kernel** 5.4.0-47-generic
**What happened**:
When I use the list task instances REST API, I cannot get the instances whose state is `no_status`.
```
### Get Task Instances
POST {{baseUrl}}/dags/~/dagRuns/~/taskInstances/list
Authorization: Basic admin:xxx
Content-Type: application/json
{
"dag_ids": ["stop_dags"]
}
or
{
"dag_ids": ["stop_dags"],
"state": ["null"]
}
```
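The same call from Python, for reference (a sketch that assumes basic auth is enabled and uses the `requests` library; the `task_instances` response field name is assumed from the stable API schema):
```python
import requests

resp = requests.post(
    "http://localhost:8080/api/v1/dags/~/dagRuns/~/taskInstances/list",
    json={"dag_ids": ["stop_dags"]},   # omitting "state" should mean "any state"
    auth=("admin", "xxx"),             # credentials from the example above
)
resp.raise_for_status()
for ti in resp.json()["task_instances"]:
    print(ti["task_id"], ti["state"])
```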
**What you expected to happen**:
All task instances should be included, regardless of state, when I don't specify any states.
**How to reproduce it**:
Use a REST test tool like Postman to call the API.
**Anything else we need to know**:
I cannot find a REST API that returns all DAG run task instances with a specific state; maybe the REST API should be extended.
Thanks!
| https://github.com/apache/airflow/issues/13531 | https://github.com/apache/airflow/pull/19487 | 1e570229533c4bbf5d3c901d5db21261fa4b1137 | f636060fd7b5eb8facd1acb10a731d4e03bc864a | "2021-01-07T07:19:52Z" | python | "2021-11-20T16:09:33Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,515 | ["airflow/providers/slack/ADDITIONAL_INFO.md", "airflow/providers/slack/BACKPORT_PROVIDER_README.md", "airflow/providers/slack/README.md", "airflow/providers/slack/hooks/slack.py", "docs/conf.py", "docs/spelling_wordlist.txt", "scripts/ci/images/ci_verify_prod_image.sh", "setup.py", "tests/providers/slack/hooks/test_slack.py"] | Update slackapiclient / slack_sdk to v3 | Hello,
Slack has released updates to its library and we can start using the new version.
We especially like one change.
> slack_sdk has no required dependencies. This means aiohttp is no longer automatically resolved.
I've looked through the documentation and it doesn't look like a difficult task, but I think it's still worth testing.
More info: https://slack.dev/python-slack-sdk/v3-migration/index.html#from-slackclient-2-x
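Based on the migration guide above, the client-side change looks small; roughly (a sketch, the token is a placeholder):
```python
# slackclient v2 style:
#   from slack import WebClient
# slack_sdk v3 style:
from slack_sdk import WebClient
from slack_sdk.errors import SlackApiError

client = WebClient(token="xoxb-...")  # placeholder token
try:
    client.chat_postMessage(channel="#general", text="hello from slack_sdk v3")
except SlackApiError as err:
    print(err.response["error"])
```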
Best regards,
Kamil Breguła | https://github.com/apache/airflow/issues/13515 | https://github.com/apache/airflow/pull/13745 | dbd026227949a74e5995c8aef3c35bd80fc36389 | 283945001363d8f492fbd25f2765d39fa06d757a | "2021-01-06T12:56:13Z" | python | "2021-01-25T21:13:48Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,494 | ["airflow/providers/google/cloud/log/stackdriver_task_handler.py", "airflow/utils/log/log_reader.py", "tests/cli/commands/test_info_command.py", "tests/providers/google/cloud/log/test_stackdriver_task_handler.py", "tests/providers/google/cloud/log/test_stackdriver_task_handler_system.py"] | Unable to view StackDriver logs in Web UI |
**Apache Airflow version**: 2.0.0
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): 1.16.15-gke.4901
**Environment**:
- **Cloud provider or hardware configuration**: GKE
- **OS** (e.g. from /etc/os-release):
- **Kernel** (e.g. `uname -a`):
- **Install tools**: Using the apache/airflow docker image
- **Others**: Running 1 pod encapsulating 2 containers (1 x webserver and 1x scheduler) running in localexecutor mode
**What happened**:
I have remote logging configured for tasks to send the logs to StackDriver as per the below configuration. The logs get sent to Stackdriver okay and I can view them via the GCP console. However I cannot view them when browsing the UI.
The UI shows a spinning wheel and I see requests in the network tab to
`https://my_airflow_instance/get_logs_with_metadata?dag_id=XXX......`
These requests take about 15 seconds to run before returning with HTTP 200 and something like this in the response body:
```
{"message":"","metadata":{"end_of_log":false,"next_page_token":"xxxxxxxxx"}}
```
So no actual log data
**What you expected to happen**:
I should see the logs in the Web UI
**How to reproduce it**:
Configure remote logging for StackDriver with the below config:
```
AIRFLOW__LOGGING__GOOGLE_KEY_PATH: "/var/run/secrets/airflow/secrets/google-cloud-platform/stackdriver/credentials.json"
AIRFLOW__LOGGING__LOG_FORMAT: "[%(asctime)s] {{%(filename)s:%(lineno)d}} %(levelname)s - %(message)s"
AIRFLOW__LOGGING__REMOTE_LOGGING: "True"
AIRFLOW__LOGGING__REMOTE_BASE_LOG_FOLDER: "stackdriver://airflow-tasks"
```
| https://github.com/apache/airflow/issues/13494 | https://github.com/apache/airflow/pull/13784 | d65376c377341fa9d6da263e145e06880d4620a8 | 833e3383230e1f6f73f8022ddf439d3d531eff01 | "2021-01-05T17:47:14Z" | python | "2021-02-02T17:38:25Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,464 | ["airflow/models/dagrun.py"] | Scheduler fails if task is removed at runtime | **Apache Airflow version**: 2.0.0, LocalExecutor
**Environment**: Docker on Win10 with WSL, official Python3.8 image
**What happened**:
When a DAG is running and I delete a task from the running DAG, the scheduler fails. When using Docker, upon automatic restart of the scheduler, the scheduler just fails again, perpetually.

Note: I don't _know_ if the task itself was running at the time, but I would guess it was.
**What you expected to happen**:
The scheduler should understand that the task is not part of the DAG anymore and not fail.
**How to reproduce it**:
- Create a DAG with multiple tasks
- Let it run
- While running, delete one of the tasks from the source code
- See the scheduler break | https://github.com/apache/airflow/issues/13464 | https://github.com/apache/airflow/pull/14057 | d45739f7ce0de183329d67fff88a9da3943a9280 | eb78a8b86c6e372bbf4bfacb7628b154c16aa16b | "2021-01-04T16:54:58Z" | python | "2021-02-04T10:08:17Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,451 | ["airflow/providers/http/sensors/http.py", "tests/providers/http/sensors/test_http.py"] | Modify HttpSensor to continue poking if the response is not 404 | **Description**
As documented in the [HttpSensor](https://airflow.apache.org/docs/apache-airflow-providers-http/stable/_modules/airflow/providers/http/sensors/http.html), if the response for the HTTP call is an error different from "404", the task will fail.
>HTTP Error codes other than 404 (like 403) or Connection Refused Error
> would fail the sensor itself directly (no more poking).
The code block that applies this behavior:
```
except AirflowException as exc:
if str(exc).startswith("404"):
return False
raise exc
```
**Use case / motivation**
I am working with an API that returns 500 for any error that happens internally (unauthorized, Not Acceptable, etc.) and need the sensor to be able to continue poking even when the response is different from 404.
Another case is an API that sometimes returns 429 and makes the task fail. (This could be solved with a large poke interval.)
The first API has a bad design, but since we need to consume some services like this, I would like to have more flexibility when working with HttpSensor.
**What do you want to happen**
When creating an HttpSensor task, I would like to be able to pass a list of status codes that make the sensor return "False" (and therefore continue poking) when the HTTP status code in the response matches one of the codes in the list.
If no status codes are set, the default that returns False and continues poking will be 404, as it is now.
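One possible shape for this, sketched as a subclass of the current sensor (the `soft_fail_codes` parameter name is my own; it relies on the status-code prefix convention shown in the snippet above):
```python
from airflow.exceptions import AirflowException
from airflow.providers.http.sensors.http import HttpSensor


class TolerantHttpSensor(HttpSensor):
    """Keep poking when the response status code is in `soft_fail_codes`."""

    def __init__(self, *, soft_fail_codes=("404",), **kwargs):
        super().__init__(**kwargs)
        self.soft_fail_codes = tuple(soft_fail_codes)

    def poke(self, context):
        try:
            return super().poke(context)
        except AirflowException as exc:
            # The hook prefixes the exception message with the HTTP status code.
            if str(exc).startswith(self.soft_fail_codes):
                return False
            raise
```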
**Are you willing to submit a PR?**
Yep! | https://github.com/apache/airflow/issues/13451 | https://github.com/apache/airflow/pull/13499 | 7a742cb03375a57291242131a27ffd4903bfdbd8 | 1602ec97c8d5bc7a7a8b42e850ac6c7a7030e47d | "2021-01-03T17:10:29Z" | python | "2021-01-20T00:02:08Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,434 | ["airflow/models/dag.py", "tests/jobs/test_scheduler_job.py"] | Airflow 2.0.0 manual run causes scheduled run to skip | **Apache Airflow version**: 2.0.0
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): N/A
**Environment**:
- **Cloud provider or hardware configuration**: local/aws
- **OS** (e.g. from /etc/os-release): Ubuntu 18.04.5 LTS
- **Kernel** (e.g. `uname -a`): 5.4.0-1032-aws
- **Install tools**: pip
- **Others**:
**What happened**:
I did a fresh Airflow 2.0.0 install. With this version, when I manually trigger a DAG, Airflow skips the next scheduled run.
**What you expected to happen**:
Manual runs do not interfere with the scheduled runs prior to Airflow 2.
**How to reproduce it**:
Create a simple hourly DAG. After enabling it and the initial run, run it manually. It shall skip the next hour. Below is an example, where the manual run with execution time of 08:17 causes the scheduled run with execution time of 08:00 to skip.

| https://github.com/apache/airflow/issues/13434 | https://github.com/apache/airflow/pull/13963 | 8e0db6eae371856597dce0ccf8a920b0107965cd | de277c69e7909cf0d563bbd542166397523ebbe0 | "2021-01-02T12:59:07Z" | python | "2021-01-30T12:02:53Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,414 | ["airflow/operators/trigger_dagrun.py", "tests/operators/test_trigger_dagrun.py"] | DAG raises error when passing non serializable JSON object via trigger | When passing a non serializable JSON object in a trigger, I get the following error below. The logs become unavailable.
my code:
```py
task_trigger_ad_attribution = TriggerDagRunOperator(
task_id='trigger_ad_attribution',
trigger_dag_id=AD_ATTRIBUTION_DAG_ID,
conf={"message": "Triggered from display trigger",
'trigger_info':
{'dag_id':DAG_ID,
'now':datetime.datetime.now(),
},
'trigger_date' : '{{execution_date}}'
},
)
```
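A workaround sketch that keeps `conf` JSON-serializable (my suggestion, not part of the original report); the error output it avoids follows below:
```python
import datetime

DAG_ID = "display_dag"  # stands in for the DAG_ID used in the snippet above

conf = {
    "message": "Triggered from display trigger",
    "trigger_info": {
        "dag_id": DAG_ID,
        "now": datetime.datetime.now().isoformat(),  # a string instead of a datetime
    },
    "trigger_date": "{{ execution_date }}",
}
```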
```
Ooops!
Something bad has happened.
Please consider letting us know by creating a bug report using GitHub.
Python version: 3.6.9
Airflow version: 2.0.0
Node: henry-Inspiron-5566
-------------------------------------------------------------------------------
Traceback (most recent call last):
File "/home/henry/Envs2/airflow3/lib/python3.6/site-packages/flask/app.py", line 2447, in wsgi_app
response = self.full_dispatch_request()
File "/home/henry/Envs2/airflow3/lib/python3.6/site-packages/flask/app.py", line 1952, in full_dispatch_request
rv = self.handle_user_exception(e)
File "/home/henry/Envs2/airflow3/lib/python3.6/site-packages/flask/app.py", line 1821, in handle_user_exception
reraise(exc_type, exc_value, tb)
File "/home/henry/Envs2/airflow3/lib/python3.6/site-packages/flask/_compat.py", line 39, in reraise
raise value
File "/home/henry/Envs2/airflow3/lib/python3.6/site-packages/flask/app.py", line 1950, in full_dispatch_request
rv = self.dispatch_request()
File "/home/henry/Envs2/airflow3/lib/python3.6/site-packages/flask/app.py", line 1936, in dispatch_request
return self.view_functions[rule.endpoint](**req.view_args)
File "/home/henry/Envs2/airflow3/lib/python3.6/site-packages/airflow/www/auth.py", line 34, in decorated
return func(*args, **kwargs)
File "/home/henry/Envs2/airflow3/lib/python3.6/site-packages/airflow/www/decorators.py", line 97, in view_func
return f(*args, **kwargs)
File "/home/henry/Envs2/airflow3/lib/python3.6/site-packages/airflow/www/decorators.py", line 60, in wrapper
return f(*args, **kwargs)
File "/home/henry/Envs2/airflow3/lib/python3.6/site-packages/airflow/www/views.py", line 1997, in tree
data = htmlsafe_json_dumps(data, separators=(',', ':'))
File "/home/henry/Envs2/airflow3/lib/python3.6/site-packages/jinja2/utils.py", line 614, in htmlsafe_json_dumps
dumper(obj, **kwargs)
File "/usr/lib/python3.6/json/__init__.py", line 238, in dumps
**kw).encode(obj)
File "/usr/lib/python3.6/json/encoder.py", line 199, in encode
chunks = self.iterencode(o, _one_shot=True)
File "/usr/lib/python3.6/json/encoder.py", line 257, in iterencode
return _iterencode(o, 0)
File "/usr/lib/python3.6/json/encoder.py", line 180, in default
o.__class__.__name__)
TypeError: Object of type 'datetime' is not JSON serializable
``` | https://github.com/apache/airflow/issues/13414 | https://github.com/apache/airflow/pull/13964 | 862443f6d3669411abfb83082c29c2fad7fcf12d | b4885b25871ae7ede2028f81b0d88def3e22f23a | "2020-12-31T20:51:45Z" | python | "2021-01-29T16:24:46Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,376 | ["airflow/cli/commands/sync_perm_command.py", "tests/cli/commands/test_sync_perm_command.py"] | airflow sync-perm command does not sync DAG level Access Control |
**Apache Airflow version**: 2.0.0
**What happened**:
Running the sync-perm CLI command does not synchronize the permissions granted through the DAG via access_control.
This is because of dag serialization. When dag serialization is enabled, the dagbag will exhibit a lazy loading behaviour.
**How to reproduce it**:
1. Add access_control to a DAG where the new role has permission to see the DAG (see the full DAG sketch after these steps).
```
access_control={
"test": {'can_dag_read'}
},
```
2. Run `airflow sync-perm`.
3. Log in as the new user and you will still not see any DAG.
4. If you refresh the DAG, the new user will be able to see the DAG after they refresh their page.
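For step 1, a minimal DAG sketch (dag_id is illustrative; the permission name is taken from the snippet above):
```python
from airflow import DAG
from airflow.operators.dummy import DummyOperator
from airflow.utils.dates import days_ago

with DAG(
    dag_id="access_control_example",
    start_date=days_ago(1),
    schedule_interval=None,
    access_control={"test": {"can_dag_read"}},  # role -> DAG-level permissions
) as dag:
    DummyOperator(task_id="noop")
```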
**Expected behavior**
When I run `airflow sync-perm`, I expect the role who has been granted read permission for the DAG to be able to see that DAG.
This is also an issue in 1.10.x with DAG Serialization enabled, so it would be good to backport the fix too.
| https://github.com/apache/airflow/issues/13376 | https://github.com/apache/airflow/pull/13377 | d5cf993f81ea2c4b5abfcb75ef05a6f3783874f2 | 1b94346fbeca619f3084d05bdc5358836ed02318 | "2020-12-29T23:33:44Z" | python | "2020-12-30T11:35:45Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,360 | ["airflow/providers/amazon/aws/transfers/mongo_to_s3.py", "tests/providers/amazon/aws/transfers/test_mongo_to_s3.py"] | Add 'mongo_collection' to template_fields in MongoToS3Operator |
**Description**
Make `MongoToS3Operator` `mongo_collection` parameter templated.
**Use case / motivation**
This would allow for passing a templated mongo collection from other tasks, such as a mongo collection used as data destination by using `S3Hook`. For instance, we could use templated mongo collection to write data for different dates in different collections by using: `mycollection.{{ ds_nodash }}`.
**Are you willing to submit a PR?**
Yes.
**Related Issues**
N/A
| https://github.com/apache/airflow/issues/13360 | https://github.com/apache/airflow/pull/13361 | e43688358320a5f20776c0d346c310a568a55049 | f7a1334abe4417409498daad52c97d3f0eb95137 | "2020-12-29T11:42:55Z" | python | "2021-01-02T10:32:07Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,325 | ["airflow/jobs/scheduler_job.py", "tests/jobs/test_scheduler_job.py"] | max_tis_per_query=0 leads to nothing being scheduled in 2.0.0 | After upgrading to Airflow 2.0.0 it seems as if the scheduler isn't working anymore. Tasks hang in the scheduled state, but no tasks get executed. I've tested this with both the sequential and the celery executor. When using the celery executor, no messages seem to arrive in RabbitMQ.
This is on local docker. Everything was working fine before upgrading. There don't seem to be any error messages, so I'm not completely sure if this is a bug or a misconfiguration on my end.
Using the python:3.7-slim-stretch Docker image. The regular setup we're using is CeleryExecutor. MySQL version is 5.7.
Any help would be greatly appreciated.
**Python packages**
alembic==1.4.3
altair==4.1.0
amazon-kclpy==1.5.0
amqp==2.6.1
apache-airflow==2.0.0
apache-airflow-providers-amazon==1.0.0
apache-airflow-providers-celery==1.0.0
apache-airflow-providers-ftp==1.0.0
apache-airflow-providers-http==1.0.0
apache-airflow-providers-imap==1.0.0
apache-airflow-providers-jdbc==1.0.0
apache-airflow-providers-mysql==1.0.0
apache-airflow-providers-sqlite==1.0.0
apache-airflow-upgrade-check==1.1.0
apispec==3.3.2
appdirs==1.4.4
argcomplete==1.12.2
argon2-cffi==20.1.0
asn1crypto==1.4.0
async-generator==1.10
attrs==20.3.0
azure-common==1.1.26
azure-core==1.9.0
azure-storage-blob==12.6.0
Babel==2.9.0
backcall==0.2.0
bcrypt==3.2.0
billiard==3.6.3.0
black==20.8b1
bleach==3.2.1
boa-str==1.1.0
boto==2.49.0
boto3==1.7.3
botocore==1.10.84
cached-property==1.5.2
cattrs==1.1.2
cbsodata==1.3.3
celery==4.4.2
certifi==2020.12.5
cffi==1.14.4
chardet==3.0.4
click==7.1.2
clickclick==20.10.2
cmdstanpy==0.9.5
colorama==0.4.4
colorlog==4.0.2
commonmark==0.9.1
connexion==2.7.0
convertdate==2.3.0
coverage==4.2
croniter==0.3.36
cryptography==3.3.1
cycler==0.10.0
Cython==0.29.21
decorator==4.4.2
defusedxml==0.6.0
dill==0.3.3
dnspython==2.0.0
docutils==0.14
email-validator==1.1.2
entrypoints==0.3
ephem==3.7.7.1
et-xmlfile==1.0.1
fbprophet==0.7.1
fire==0.3.1
Flask==1.1.2
Flask-AppBuilder==3.1.1
Flask-Babel==1.0.0
Flask-Bcrypt==0.7.1
Flask-Caching==1.9.0
Flask-JWT-Extended==3.25.0
Flask-Login==0.4.1
Flask-OpenID==1.2.5
Flask-SQLAlchemy==2.4.4
flask-swagger==0.2.13
Flask-WTF==0.14.3
flatten-json==0.1.7
flower==0.9.5
funcsigs==1.0.2
future==0.18.2
graphviz==0.15
great-expectations==0.13.2
gunicorn==19.10.0
holidays==0.10.4
humanize==3.2.0
idna==2.10
importlib-metadata==1.7.0
importlib-resources==1.5.0
inflection==0.5.1
ipykernel==5.4.2
ipython==7.19.0
ipython-genutils==0.2.0
ipywidgets==7.5.1
iso8601==0.1.13
isodate==0.6.0
itsdangerous==1.1.0
JayDeBeApi==1.2.3
jdcal==1.4.1
jedi==0.17.2
jellyfish==0.8.2
Jinja2==2.11.2
jmespath==0.10.0
joblib==1.0.0
JPype1==1.2.0
json-merge-patch==0.2
jsonpatch==1.28
jsonpointer==2.0
jsonschema==3.2.0
jupyter-client==6.1.7
jupyter-core==4.7.0
jupyterlab-pygments==0.1.2
kinesis-events==0.1.0
kiwisolver==1.3.1
kombu==4.6.11
korean-lunar-calendar==0.2.1
lazy-object-proxy==1.4.3
lockfile==0.12.2
LunarCalendar==0.0.9
Mako==1.1.3
Markdown==3.3.3
MarkupSafe==1.1.1
marshmallow==3.10.0
marshmallow-enum==1.5.1
marshmallow-oneofschema==2.0.1
marshmallow-sqlalchemy==0.23.1
matplotlib==3.3.3
mistune==0.8.4
mock==1.0.1
mockito==1.2.2
msrest==0.6.19
mypy-extensions==0.4.3
mysql-connector-python==8.0.18
mysqlclient==2.0.2
natsort==7.1.0
nbclient==0.5.1
nbconvert==6.0.7
nbformat==5.0.8
nest-asyncio==1.4.3
nose==1.3.7
notebook==6.1.5
numpy==1.19.4
oauthlib==3.1.0
openapi-spec-validator==0.2.9
openpyxl==3.0.5
oscrypto==1.2.1
packaging==20.8
pandas==1.1.5
pandocfilters==1.4.3
parso==0.7.1
pathspec==0.8.1
pendulum==2.1.2
pexpect==4.8.0
phonenumbers==8.12.15
pickleshare==0.7.5
Pillow==8.0.1
prison==0.1.3
prometheus-client==0.8.0
prompt-toolkit==3.0.8
protobuf==3.14.0
psutil==5.8.0
ptyprocess==0.6.0
pyarrow==2.0.0
pycodestyle==2.6.0
pycparser==2.20
pycryptodomex==3.9.9
pydevd-pycharm==193.5233.109
Pygments==2.7.3
PyJWT==1.7.1
PyMeeus==0.3.7
pyodbc==4.0.30
pyOpenSSL==19.1.0
pyparsing==2.4.7
pyrsistent==0.17.3
pystan==2.19.1.1
python-crontab==2.5.1
python-daemon==2.2.4
python-dateutil==2.8.1
python-editor==1.0.4
python-nvd3==0.15.0
python-slugify==4.0.1
python3-openid==3.2.0
pytz==2019.3
pytzdata==2020.1
PyYAML==5.3.1
pyzmq==20.0.0
recordlinkage==0.14
regex==2020.11.13
requests==2.23.0
requests-oauthlib==1.3.0
rich==9.2.0
ruamel.yaml==0.16.12
ruamel.yaml.clib==0.2.2
s3transfer==0.1.13
scikit-learn==0.23.2
scipy==1.5.4
scriptinep3==0.3.1
Send2Trash==1.5.0
setproctitle==1.2.1
setuptools-git==1.2
shelljob==0.5.6
six==1.15.0
sklearn==0.0
snowflake-connector-python==2.3.7
snowflake-sqlalchemy==1.2.4
SQLAlchemy==1.3.22
SQLAlchemy-JSONField==1.0.0
SQLAlchemy-Utils==0.36.8
swagger-ui-bundle==0.0.8
tabulate==0.8.7
TagValidator==0.0.8
tenacity==6.2.0
termcolor==1.1.0
terminado==0.9.1
testpath==0.4.4
text-unidecode==1.3
threadpoolctl==2.1.0
thrift==0.13.0
toml==0.10.2
toolz==0.11.1
tornado==6.1
tqdm==4.54.1
traitlets==5.0.5
typed-ast==1.4.1
typing-extensions==3.7.4.3
tzlocal==1.5.1
unicodecsv==0.14.1
urllib3==1.24.2
validate-email==1.3
vine==1.3.0
watchtower==0.7.3
wcwidth==0.2.5
webencodings==0.5.1
Werkzeug==1.0.1
widgetsnbextension==3.5.1
wrapt==1.12.1
WTForms==2.3.1
xlrd==2.0.1
XlsxWriter==1.3.7
zipp==3.4.0
**Relevant config**
```
# The folder where your airflow pipelines live, most likely a
# subfolder in a code repositories
# This path must be absolute
dags_folder = /usr/local/airflow/dags
# The executor class that airflow should use. Choices include
# SequentialExecutor, LocalExecutor, CeleryExecutor, DaskExecutor
executor = CeleryExecutor
# The SqlAlchemy connection string to the metadata database.
# SqlAlchemy supports many different database engine, more information
# their website
sql_alchemy_conn = db+mysql://airflow:airflow@postgres/airflow
# The SqlAlchemy pool size is the maximum number of database connections
# in the pool.
sql_alchemy_pool_size = 5
# The SqlAlchemy pool recycle is the number of seconds a connection
# can be idle in the pool before it is invalidated. This config does
# not apply to sqlite.
sql_alchemy_pool_recycle = 3600
# The amount of parallelism as a setting to the executor. This defines
# the max number of task instances that should run simultaneously
# on this airflow installation
parallelism = 32
# The number of task instances allowed to run concurrently by the scheduler
dag_concurrency = 16
# Are DAGs paused by default at creation
dags_are_paused_at_creation = True
# When not using pools, tasks are run in the "default pool",
# whose size is guided by this config element
non_pooled_task_slot_count = 128
# The maximum number of active DAG runs per DAG
max_active_runs_per_dag = 16
# How long before timing out a python file import while filling the DagBag
dagbag_import_timeout = 60
# The class to use for running task instances in a subprocess
task_runner = StandardTaskRunner
# Whether to enable pickling for xcom (note that this is insecure and allows for
# RCE exploits). This will be deprecated in Airflow 2.0 (be forced to False).
enable_xcom_pickling = True
# When a task is killed forcefully, this is the amount of time in seconds that
# it has to cleanup after it is sent a SIGTERM, before it is SIGKILLED
killed_task_cleanup_time = 60
# This flag decides whether to serialise DAGs and persist them in DB. If set to True, Webserver reads from DB instead of parsing DAG files
store_dag_code = True
# You can also update the following default configurations based on your needs
min_serialized_dag_update_interval = 30
min_serialized_dag_fetch_interval = 10
[celery]
# This section only applies if you are using the CeleryExecutor in
# [core] section above
# The app name that will be used by celery
celery_app_name = airflow.executors.celery_executor
# The concurrency that will be used when starting workers with the
# "airflow worker" command. This defines the number of task instances that
# a worker will take, so size up your workers based on the resources on
# your worker box and the nature of your tasks
worker_concurrency = 16
# When you start an airflow worker, airflow starts a tiny web server
# subprocess to serve the workers local log files to the airflow main
# web server, who then builds pages and sends them to users. This defines
# the port on which the logs are served. It needs to be unused, and open
# visible from the main web server to connect into the workers.
worker_log_server_port = 8793
# The Celery broker URL. Celery supports RabbitMQ, Redis and experimentally
# a sqlalchemy database. Refer to the Celery documentation for more
# information.
broker_url = amqp://amqp:5672/1
# Another key Celery setting
result_backend = db+mysql://airflow:airflow@postgres/airflow
# Celery Flower is a sweet UI for Celery. Airflow has a shortcut to start
# it `airflow flower`. This defines the IP that Celery Flower runs on
flower_host = 0.0.0.0
# This defines the port that Celery Flower runs on
flower_port = 5555
# Default queue that tasks get assigned to and that worker listen on.
default_queue = airflow
# Import path for celery configuration options
celery_config_options = airflow.config_templates.default_celery.DEFAULT_CELERY_CONFIG
# No SSL
ssl_active = False
[scheduler]
# Task instances listen for external kill signal (when you clear tasks
# from the CLI or the UI), this defines the frequency at which they should
# listen (in seconds).
job_heartbeat_sec = 5
# The scheduler constantly tries to trigger new tasks (look at the
# scheduler section in the docs for more information). This defines
# how often the scheduler should run (in seconds).
scheduler_heartbeat_sec = 5
# after how much time should the scheduler terminate in seconds
# -1 indicates to run continuously (see also num_runs)
run_duration = -1
# after how much time a new DAGs should be picked up from the filesystem
min_file_process_interval = 60
use_row_level_locking=False
dag_dir_list_interval = 300
# How often should stats be printed to the logs
print_stats_interval = 30
child_process_log_directory = /usr/local/airflow/logs/scheduler
# Local task jobs periodically heartbeat to the DB. If the job has
# not heartbeat in this many seconds, the scheduler will mark the
# associated task instance as failed and will re-schedule the task.
scheduler_zombie_task_threshold = 300
# Turn off scheduler catchup by setting this to False.
# Default behavior is unchanged and
# Command Line Backfills still work, but the scheduler
# will not do scheduler catchup if this is False,
# however it can be set on a per DAG basis in the
# DAG definition (catchup)
catchup_by_default = True
# This changes the batch size of queries in the scheduling main loop.
# This depends on query length limits and how long you are willing to hold locks.
# 0 for no limit
max_tis_per_query = 0
# The scheduler can run multiple threads in parallel to schedule dags.
# This defines how many threads will run.
parsing_processes = 4
authenticate = False
``` | https://github.com/apache/airflow/issues/13325 | https://github.com/apache/airflow/pull/13512 | b103a1dd0e22b67dcc8cb2a28a5afcdfb7554412 | 31d31adb58750d473593a9b13c23afcc9a0adf97 | "2020-12-27T10:25:52Z" | python | "2021-01-18T21:24:37Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,306 | ["BREEZE.rst", "Dockerfile", "Dockerfile.ci", "scripts/ci/images/ci_verify_prod_image.sh", "scripts/ci/libraries/_initialization.sh", "setup.py"] | The "ldap" extra misses libldap dependency | The 'ldap' provider misses 'ldap' extra dep (which adds ldap3 pip dependency). | https://github.com/apache/airflow/issues/13306 | https://github.com/apache/airflow/pull/13308 | 13a9747bf1d92020caa5d4dc825e096ce583f2df | d23ac9b235c5b30a5d2d3a3a7edf60e0085d68de | "2020-12-24T18:21:48Z" | python | "2020-12-28T16:07:00Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,295 | ["airflow/models/dag.py", "tests/models/test_dag.py"] | In triggered SubDag (schedule_interval=None), when clearing a successful Subdag, child tasks aren't run | **Apache Airflow version**:
Airflow 2.0
**Environment**:
Ubuntu 20.04 (WSL on Windows 10)
- **OS** (e.g. from /etc/os-release):
VERSION="20.04.1 LTS (Focal Fossa)"
- **Kernel** (e.g. `uname -a`):
Linux XXX 4.19.128-microsoft-standard #1 SMP Tue Jun 23 12:58:10 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux
**What happened**:
After successfully running a SUBDAG, clearing it (including downstream+recursive) doesn't trigger the inner tasks. Instead, the subdag is marked successful and the inner tasks all stay cleared and aren't re-run.
**What you expected to happen**:
Expected Clear with DownStream + Recursive to re-run all subdag tasks.
**How to reproduce it**:
1. Using a slightly modified version of https://airflow.apache.org/docs/apache-airflow/stable/concepts.html#subdags:
```python
from airflow import DAG
from airflow.example_dags.subdags.subdag import subdag
from airflow.operators.dummy import DummyOperator
from airflow.operators.subdag import SubDagOperator
from airflow.utils.dates import days_ago
def subdag(parent_dag_name, child_dag_name, args):
dag_subdag = DAG(
dag_id=f'{parent_dag_name}.{child_dag_name}',
default_args=args,
start_date=days_ago(2),
schedule_interval=None,
)
for i in range(5):
DummyOperator(
task_id='{}-task-{}'.format(child_dag_name, i + 1),
default_args=args,
dag=dag_subdag,
)
return dag_subdag
DAG_NAME = 'example_subdag_operator'
args = {
'owner': 'airflow',
}
dag = DAG(
dag_id=DAG_NAME, default_args=args, start_date=days_ago(2), schedule_interval=None, tags=['example']
)
start = DummyOperator(
task_id='start',
dag=dag,
)
section_1 = SubDagOperator(
task_id='section-1',
subdag=subdag(DAG_NAME, 'section-1', args),
dag=dag,
)
some_other_task = DummyOperator(
task_id='some-other-task',
dag=dag,
)
section_2 = SubDagOperator(
task_id='section-2',
subdag=subdag(DAG_NAME, 'section-2', args),
dag=dag,
)
end = DummyOperator(
task_id='end',
dag=dag,
)
start >> section_1 >> some_other_task >> section_2 >> end
```
2. Run the subdag fully.
3. Clear (with recursive/downstream) any of the SubDags.
4. The Subdag will be marked successful, but if you zoom into the subdag, you'll see all the child tasks were not run.
| https://github.com/apache/airflow/issues/13295 | https://github.com/apache/airflow/pull/14776 | 0b50e3228519138c9826bc8e98f0ab5dc40a268d | 052163516bf91ab7bb53f4ec3c7b5621df515820 | "2020-12-24T01:51:24Z" | python | "2021-03-18T10:38:52Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,226 | ["UPDATING.md"] | Use of SQLAInterface in custom models in Plugins | We might need to add to Airflow 2.0 upgrade documentation the need to use `CustomSQLAInterface` instead of `SQLAInterface`.
If you want to define your own appbuilder models you need to change the interface to a Custom one:
Non-RBAC replace:
```
from flask_appbuilder.models.sqla.interface import SQLAInterface
datamodel = SQLAInterface(your_data_model)
```
with RBAC (in 1.10):
```
from airflow.www_rbac.utils import CustomSQLAInterface
datamodel = CustomSQLAInterface(your_data_model)
```
and in 2.0:
```
from airflow.www.utils import CustomSQLAInterface
datamodel = CustomSQLAInterface(your_data_model)
```
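For a custom appbuilder view registered from a plugin, the change ends up looking roughly like this (model and view names are illustrative):
```python
from flask_appbuilder import ModelView

from airflow.www.utils import CustomSQLAInterface
from my_plugin.models import MyModel  # illustrative model


class MyModelView(ModelView):
    datamodel = CustomSQLAInterface(MyModel)
```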
| https://github.com/apache/airflow/issues/13226 | https://github.com/apache/airflow/pull/14478 | 0a969db2b025709505f8043721c83218a73bb84d | 714a07542c2560b50d013d66f71ad9a209dd70b6 | "2020-12-21T17:40:47Z" | python | "2021-03-03T00:29:54Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,225 | ["airflow/api_connexion/openapi/v1.yaml", "airflow/api_connexion/schemas/task_instance_schema.py", "airflow/models/dag.py", "tests/api_connexion/endpoints/test_task_instance_endpoint.py", "tests/api_connexion/schemas/test_task_instance_schema.py"] | Clear Tasks via the stable REST API with task_id filter | **Description**
I have noticed that the stable REST API doesn't have the ability to run a task (which is possible from the Airflow web interface).
I think it would be nice to have either of these abilities available for integrations:
- Run a task
- Run all failing tasks (rerun from the point of failure)
**Use case / motivation**
I would like the ability to identify the failing tasks on a specific DAG Run and rerun only them.
I would like to do it remotely (non-interactive) using the REST API.
I could write a script that runs only the failing tasks, but I couldn't find a way to "run" a task when I have the task instance ID.
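For the "identify the failing tasks" half this is already possible today; a sketch against the existing stable API (endpoint, parameter and field names are assumed from the stable REST API spec; credentials and IDs are illustrative):
```python
import requests

BASE = "http://localhost:8080/api/v1"
AUTH = ("admin", "admin")  # illustrative credentials

resp = requests.get(
    f"{BASE}/dags/my_dag/dagRuns/my_run_id/taskInstances",
    params={"state": "failed"},
    auth=AUTH,
)
resp.raise_for_status()
failed_task_ids = [ti["task_id"] for ti in resp.json()["task_instances"]]
# Re-running or clearing exactly these task_ids is the missing piece this issue asks for.
```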
**Are you willing to submit a PR?**
Not at this stage
**Related Issues**
| https://github.com/apache/airflow/issues/13225 | https://github.com/apache/airflow/pull/14500 | a265fd54792bb7638188eaf4f6332ae95d24899e | e150bbfe0a7474308ba7df9c89e699b77c45bb5c | "2020-12-21T17:38:56Z" | python | "2021-04-07T06:54:34Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,214 | ["airflow/migrations/versions/2c6edca13270_resource_based_permissions.py"] | Make migration logging consistent | **Apache Airflow version**:
2.0.0.dev
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release):
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
**What happened**:
When I run `airflow db reset -y` I get:
```
INFO [alembic.runtime.migration] Running upgrade bef4f3d11e8b -> 98271e7606e2, Add scheduling_decision to DagRun and DAG
INFO [alembic.runtime.migration] Running upgrade 98271e7606e2 -> 52d53670a240, fix_mssql_exec_date_rendered_task_instance_fields_for_MSSQL
INFO [alembic.runtime.migration] Running upgrade 52d53670a240 -> 364159666cbd, Add creating_job_id to DagRun table
INFO [alembic.runtime.migration] Running upgrade 364159666cbd -> 45ba3f1493b9, add-k8s-yaml-to-rendered-templates
INFO [alembic.runtime.migration] Running upgrade 45ba3f1493b9 -> 849da589634d, Prefix DAG permissions.
INFO [alembic.runtime.migration] Running upgrade 849da589634d -> 2c6edca13270, Resource based permissions.
[2020-12-21 10:15:40,510] {manager.py:727} WARNING - No user yet created, use flask fab command to do it.
[2020-12-21 10:15:41,964] {providers_manager.py:291} WARNING - Exception when importing 'airflow.providers.google.cloud.hooks.compute_ssh.ComputeEngineSSHHook' from 'apache-airflow-providers-google' package: No module named 'google.cloud.oslogin_v1'
[2020-12-21 10:15:42,791] {providers_manager.py:291} WARNING - Exception when importing 'airflow.providers.google.cloud.hooks.compute_ssh.ComputeEngineSSHHook' from 'apache-airflow-providers-google' package: No module named 'google.cloud.oslogin_v1'
[2020-12-21 10:15:47,157] {migration.py:515} INFO - Running upgrade 2c6edca13270 -> 61ec73d9401f, Add description field to connection
[2020-12-21 10:15:47,160] {migration.py:515} INFO - Running upgrade 61ec73d9401f -> 64a7d6477aae, fix description field in connection to be text
[2020-12-21 10:15:47,164] {migration.py:515} INFO - Running upgrade 64a7d6477aae -> e959f08ac86c, Change field in DagCode to MEDIUMTEXT for MySql
[2020-12-21 10:15:47,381] {dagbag.py:440} INFO - Filling up the DagBag from /root/airflow/dags
[2020-12-21 10:15:47,857] {dag.py:1813} INFO - Sync 29 DAGs
[2020-12-21 10:15:47,870] {dag.py:1832} INFO - Creating ORM DAG for example_bash_operator
[2020-12-21 10:15:47,871] {dag.py:1832} INFO - Creating ORM DAG for example_kubernetes_executor
[2020-12-21 10:15:47,872] {dag.py:1832} INFO - Creating ORM DAG for example_xcom_args
[2020-12-21 10:15:47,873] {dag.py:1832} INFO - Creating ORM DAG for tutorial
[2020-12-21 10:15:47,873] {dag.py:1832} INFO - Creating ORM DAG for example_python_operator
[2020-12-21 10:15:47,874] {dag.py:1832} INFO - Creating ORM DAG for example_xcom
```
**What you expected to happen**:
I expect all migration logging to be formatted in the same style. I would also love to see no unrelated logs - this would make `db reset` easier to digest.
**How to reproduce it**:
Run `airflow db reset -y`
**Anything else we need to know**:
N/A
| https://github.com/apache/airflow/issues/13214 | https://github.com/apache/airflow/pull/13458 | feb84057d34b2f64e3b5dcbaae2d3b18f5f564e4 | 43b2d3392224d8e0d6fb8ce8cdc6b0f0b0cc727b | "2020-12-21T10:21:14Z" | python | "2021-01-04T17:25:02Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,200 | ["airflow/utils/cli.py", "tests/utils/test_cli_util.py"] | CLI `airflow scheduler -D --pid <PIDFile>` fails silently if PIDFile given is a relative path |
**Apache Airflow version**: 2.0.0
**Environment**: Linux & MacOS, venv
- **OS** (e.g. from /etc/os-release): Ubuntu 18.04.3 LTS / MacOS 10.15.7
- **Kernel** (e.g. `uname -a`):
- Linux *** 5.4.0-1029-aws #30~18.04.1-Ubuntu SMP Tue Oct 20 11:09:25 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux
- Darwin *** 19.6.0 Darwin Kernel Version 19.6.0: Thu Oct 29 22:56:45 PDT 2020; root:xnu-6153.141.2.2~1/RELEASE_X86_64 x86_64
**What happened**:
Say I'm in my home directory. Running the command `airflow scheduler -D --pid test.pid` (where `test.pid` is a relative path) is supposed to start the scheduler in daemon mode and store the PID in the file `test.pid` (creating it if it doesn't exist).
However, the scheduler is NOT started. This can be verified by running `ps aux | grep airflow | grep scheduler` (no process is shown). Throughout the whole process, I don't see any error message.
However, if I change the PID file path to an absolute path, i.e. `airflow scheduler -D --pid ${PWD}/test.pid`, it successfully starts the scheduler in daemon mode (which can be verified via the method above).
**What you expected to happen**:
Even if the PID file path provided is a relative path, the scheduler should be started properly as well.
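One possible shape for a fix: normalize the path before the daemon context changes the working directory (a sketch only, not necessarily the upstream change):
```python
import os


def _normalize_pid_path(pid: str) -> str:
    # Resolve relative paths before daemonizing, since the daemon context
    # typically changes its working directory (e.g. to "/").
    return os.path.abspath(pid) if pid else pid
```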
**How to reproduce it**:
Described above
| https://github.com/apache/airflow/issues/13200 | https://github.com/apache/airflow/pull/13232 | aa00e9bcd4ec16f42338b30d29e87ccda8eecf82 | 93e4787b70a85cc5f13db5e55ef0c06629b45e6e | "2020-12-20T22:16:54Z" | python | "2020-12-22T22:18:38Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,192 | ["airflow/providers/google/cloud/operators/mlengine.py", "tests/providers/google/cloud/operators/test_mlengine.py"] | Generalize MLEngineStartTrainingJobOperator to custom images | **Description**
The operator is arguably unnecessarily limited to AI Platform’s standard images. The only change that is required to lift this constraint is making `package_uris` and `training_python_module` optional with default values `[]` and `None`, respectively. Then, using `master_config`, one can supply `imageUri` and run any image of choice.
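With that change, usage could look roughly like this (a sketch; project, image and some parameter values are illustrative):
```python
from airflow.providers.google.cloud.operators.mlengine import MLEngineStartTrainingJobOperator

train = MLEngineStartTrainingJobOperator(
    task_id="train_custom_image",
    project_id="my-project",                  # illustrative
    job_id="train_{{ ds_nodash }}",
    region="europe-west1",
    scale_tier="CUSTOM",
    master_type="n1-standard-4",
    master_config={"imageUri": "gcr.io/my-project/trainer:latest"},  # illustrative image
    # package_uris / training_python_module omitted: the point of this request
)
```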
**Use case / motivation**
This will open up for running arbitrary images on AI Platform.
**Are you willing to submit a PR?**
If the above sounds reasonable, I can open pull requests. | https://github.com/apache/airflow/issues/13192 | https://github.com/apache/airflow/pull/13318 | 6e1a6ff3c8a4f8f9bcf8b7601362359bfb2be6bf | f6518dd6a1217d906d863fe13dc37916efd78b3e | "2020-12-20T10:26:37Z" | python | "2021-01-02T10:34:04Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,181 | ["chart/templates/workers/worker-kedaautoscaler.yaml", "chart/tests/helm_template_generator.py", "chart/tests/test_keda.py"] | keda scaledobject not created even though keda enabled in helm config | In brand new cluster using k3d locally, I first installed keda:
```bash
helm install keda \
--namespace keda kedacore/keda \
--version "v1.5.0"
```
Next, I installed airflow using this config:
```yaml
executor: CeleryExecutor
defaultAirflowTag: 2.0.0-python3.7
airflowVersion: 2.0.0
workers:
keda:
enabled: true
persistence:
enabled: false
pgbouncer:
enabled: true
```
I think this should create a scaled object `airflow-worker`.
But it does not.
@turbaszek and @dimberman you may have insight ...
| https://github.com/apache/airflow/issues/13181 | https://github.com/apache/airflow/pull/13183 | 4aba9c5a8b89d2827683fb4c84ac481c89ebc2b3 | a9d562e1c3c16c98750c9e3be74347f882acb97a | "2020-12-19T08:30:36Z" | python | "2020-12-21T10:19:26Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,081 | ["docs/apache-airflow/upgrading-to-2.rst"] | OAuth2 login process is not stateless | **Apache Airflow version**: 1.10.14
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): Server Version: version.Info{Major:"1", Minor:"16+", GitVersion:"v1.16.15-eks-ad4801", GitCommit:"ad4801fd44fe0f125c8d13f1b1d4827e8884476d", GitTreeState:"clean", BuildDate:"2020-10-20T23:27:12Z", GoVersion:"go1.13.15", Compiler:"gc", Platform:"linux/amd64"}
**Environment**:
- **Cloud provider or hardware configuration**: AWS / EKS
- **OS** (e.g. from /etc/os-release): N/A
- **Kernel** (e.g. `uname -a`): N/A
- **Install tools**: N/A
- **Others**: N/A
**What happened**:
Cognito login does not work if the second request is not handled by the first pod, i.e. the pod that received the access_token headers.
**What you expected to happen**:
Logging in via Cognito OAuth2 mode / Code should work via any pod.
**How to reproduce it**:
Override `webserver_config.py` with the following code:
```
"""Default configuration for the Airflow webserver"""
import logging
import os
import json
from airflow.configuration import conf
from airflow.www_rbac.security import AirflowSecurityManager
from flask_appbuilder.security.manager import AUTH_OAUTH
log = logging.getLogger(__name__)
basedir = os.path.abspath(os.path.dirname(__file__))
# The SQLAlchemy connection string.
SQLALCHEMY_DATABASE_URI = conf.get('core', 'SQL_ALCHEMY_CONN')
# Flask-WTF flag for CSRF
WTF_CSRF_ENABLED = True
CSRF_ENABLED = True
# ----------------------------------------------------
# AUTHENTICATION CONFIG
# ----------------------------------------------------
# For details on how to set up each of the following authentication, see
# http://flask-appbuilder.readthedocs.io/en/latest/security.html#authentication-methods
# for details.
# The authentication type
AUTH_TYPE = AUTH_OAUTH
SECRET_KEY = os.environ.get("FLASK_SECRET_KEY")
OAUTH_PROVIDERS = [{
'name': 'aws_cognito',
'whitelist': ['@ga.gov.au'],
'token_key': 'access_token',
'icon': 'fa-amazon',
'remote_app': {
'api_base_url': os.environ.get("OAUTH2_BASE_URL") + "/",
'client_kwargs': {
'scope': 'openid email aws.cognito.signin.user.admin'
},
'authorize_url': os.environ.get("OAUTH2_BASE_URL") + "/authorize",
'access_token_url': os.environ.get("OAUTH2_BASE_URL") + "/token",
'request_token_url': None,
'client_id': os.environ.get("COGNITO_CLIENT_ID"),
'client_secret': os.environ.get("COGNITO_CLIENT_SECRET"),
}
}]
class CognitoAirflowSecurityManager(AirflowSecurityManager):
def oauth_user_info(self, provider, resp):
# log.info("Requesting user info from AWS Cognito: {0}".format(resp))
assert provider == "aws_cognito"
# log.info("Requesting user info from AWS Cognito: {0}".format(resp))
me = self.appbuilder.sm.oauth_remotes[provider].get("userInfo")
return {
"username": me.json().get("username"),
"email": me.json().get("email"),
"first_name": me.json().get("given_name", ""),
"last_name": me.json().get("family_name", ""),
"id": me.json().get("sub", ""),
}
SECURITY_MANAGER_CLASS = CognitoAirflowSecurityManager
```
- Set up an airflow-app linked to a Cognito user pool and run multiple replicas of the airflow-web pod.
- Login will start failing and may only work roughly 1 in 9 attempts.
**Anything else we need to know**:
There are 3 possible workarounds using infrastructure changes instead of airflow-web code changes.
- Use a single pod for airflow-web to avoid session issues
- Make ALB sticky via ingress to give users the same pod consistently
- Sharing the same secret key across all airflow-web pods using the environment
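For the third workaround, a minimal sketch: generate one key once and inject the same value into every webserver pod (the `webserver_config.py` above reads it from `FLASK_SECRET_KEY`):

```python
# Run once, then set the printed value as FLASK_SECRET_KEY (e.g. via a shared
# Kubernetes Secret) on every airflow-web replica so Flask sessions are portable.
import secrets

print(secrets.token_hex(32))
```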
| https://github.com/apache/airflow/issues/13081 | https://github.com/apache/airflow/pull/13094 | 484f95f55cda4ca4fd3157135199623c9e37cc8a | 872350bac5bebea09bd52d50734a3b7517af712c | "2020-12-15T06:41:18Z" | python | "2020-12-21T23:26:06Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,053 | ["airflow/utils/dot_renderer.py", "tests/utils/test_dot_renderer.py"] | CLI does not display TaskGroups | Hello,
Airflow has the ability to [display a DAG in the CLI](http://apache-airflow-docs.s3-website.eu-central-1.amazonaws.com/docs/apache-airflow/latest/usage-cli.html#display-dags-structure) with the command `airflow dags show`, but unfortunately this command does not display Task Groups. It would be great if the Task Groups were correctly marked in the diagrams.
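For reference, a minimal example DAG with a Task Group whose grouping is currently lost in the `airflow dags show` output (a sketch):

```python
from airflow import DAG
from airflow.operators.dummy_operator import DummyOperator
from airflow.utils.dates import days_ago
from airflow.utils.task_group import TaskGroup

with DAG("example_task_group_cli", start_date=days_ago(1), schedule_interval=None) as dag:
    start = DummyOperator(task_id="start")
    with TaskGroup("section_1") as section_1:
        task_a = DummyOperator(task_id="task_a")
        task_b = DummyOperator(task_id="task_b")
        task_a >> task_b
    end = DummyOperator(task_id="end")
    start >> section_1 >> end
```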
<img width="1268" alt="Screenshot 2020-12-14 at 02 28 58" src="https://user-images.githubusercontent.com/12058428/102030893-9f3e4d00-3db4-11eb-8c2d-f33e38d01997.png">
<img width="681" alt="Screenshot 2020-12-14 at 02 29 16" src="https://user-images.githubusercontent.com/12058428/102030898-a2d1d400-3db4-11eb-9b31-0cde70fea675.png">
Best regards,
Kamil Breguła | https://github.com/apache/airflow/issues/13053 | https://github.com/apache/airflow/pull/14269 | 21f297425ae85ce89e21477d55b51d5560f47bf8 | c71f707d24a9196d33b91a7a2a9e3384698e5193 | "2020-12-14T01:34:50Z" | python | "2021-02-25T15:23:15Z" |
closed | apache/airflow | https://github.com/apache/airflow | 13,027 | ["MANIFEST.in"] | No such file or directory: '/usr/local/lib/python3.9/site-packages/airflow/customized_form_field_behaviours.schema.json' | v2.0.0rc1
```
airflow db init
DB: sqlite:////Users/red/airflow/airflow.db
[2020-12-12 00:33:02,036] {db.py:678} INFO - Creating tables
INFO [alembic.runtime.migration] Context impl SQLiteImpl.
INFO [alembic.runtime.migration] Will assume non-transactional DDL.
Traceback (most recent call last):
File "/usr/local/bin/airflow", line 8, in <module>
sys.exit(main())
File "/usr/local/lib/python3.9/site-packages/airflow/__main__.py", line 40, in main
args.func(args)
File "/usr/local/lib/python3.9/site-packages/airflow/cli/cli_parser.py", line 48, in command
return func(*args, **kwargs)
File "/usr/local/lib/python3.9/site-packages/airflow/cli/commands/db_command.py", line 31, in initdb
db.initdb()
File "/usr/local/lib/python3.9/site-packages/airflow/utils/db.py", line 549, in initdb
upgradedb()
File "/usr/local/lib/python3.9/site-packages/airflow/utils/db.py", line 688, in upgradedb
command.upgrade(config, 'heads')
File "/usr/local/lib/python3.9/site-packages/alembic/command.py", line 298, in upgrade
script.run_env()
File "/usr/local/lib/python3.9/site-packages/alembic/script/base.py", line 489, in run_env
util.load_python_file(self.dir, "env.py")
File "/usr/local/lib/python3.9/site-packages/alembic/util/pyfiles.py", line 98, in load_python_file
module = load_module_py(module_id, path)
File "/usr/local/lib/python3.9/site-packages/alembic/util/compat.py", line 184, in load_module_py
spec.loader.exec_module(module)
File "<frozen importlib._bootstrap_external>", line 790, in exec_module
File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
File "/usr/local/lib/python3.9/site-packages/airflow/migrations/env.py", line 108, in <module>
run_migrations_online()
File "/usr/local/lib/python3.9/site-packages/airflow/migrations/env.py", line 102, in run_migrations_online
context.run_migrations()
File "<string>", line 8, in run_migrations
File "/usr/local/lib/python3.9/site-packages/alembic/runtime/environment.py", line 846, in run_migrations
self.get_context().run_migrations(**kw)
File "/usr/local/lib/python3.9/site-packages/alembic/runtime/migration.py", line 511, in run_migrations
for step in self._migrations_fn(heads, self):
File "/usr/local/lib/python3.9/site-packages/alembic/command.py", line 287, in upgrade
return script._upgrade_revs(revision, rev)
File "/usr/local/lib/python3.9/site-packages/alembic/script/base.py", line 364, in _upgrade_revs
revs = list(revs)
File "/usr/local/lib/python3.9/site-packages/alembic/script/revision.py", line 777, in _iterate_revisions
uppers = util.dedupe_tuple(self.get_revisions(upper))
File "/usr/local/lib/python3.9/site-packages/alembic/script/revision.py", line 321, in get_revisions
resolved_id, branch_label = self._resolve_revision_number(id_)
File "/usr/local/lib/python3.9/site-packages/alembic/script/revision.py", line 501, in _resolve_revision_number
self._revision_map
File "/usr/local/lib/python3.9/site-packages/alembic/util/langhelpers.py", line 230, in __get__
obj.__dict__[self.__name__] = result = self.fget(obj)
File "/usr/local/lib/python3.9/site-packages/alembic/script/revision.py", line 123, in _revision_map
for revision in self._generator():
File "/usr/local/lib/python3.9/site-packages/alembic/script/base.py", line 112, in _load_revisions
script = Script._from_filename(self, vers, file_)
File "/usr/local/lib/python3.9/site-packages/alembic/script/base.py", line 906, in _from_filename
module = util.load_python_file(dir_, filename)
File "/usr/local/lib/python3.9/site-packages/alembic/util/pyfiles.py", line 98, in load_python_file
module = load_module_py(module_id, path)
File "/usr/local/lib/python3.9/site-packages/alembic/util/compat.py", line 184, in load_module_py
spec.loader.exec_module(module)
File "<frozen importlib._bootstrap_external>", line 790, in exec_module
File "<frozen importlib._bootstrap>", line 228, in _call_with_frames_removed
File "/usr/local/lib/python3.9/site-packages/airflow/migrations/versions/2c6edca13270_resource_based_permissions.py", line 27, in <module>
from airflow.www.app import create_app
File "/usr/local/lib/python3.9/site-packages/airflow/www/app.py", line 38, in <module>
from airflow.www.extensions.init_views import (
File "/usr/local/lib/python3.9/site-packages/airflow/www/extensions/init_views.py", line 29, in <module>
from airflow.www.views import lazy_add_provider_discovered_options_to_connection_form
File "/usr/local/lib/python3.9/site-packages/airflow/www/views.py", line 2836, in <module>
class ConnectionFormWidget(FormWidget):
File "/usr/local/lib/python3.9/site-packages/airflow/www/views.py", line 2839, in ConnectionFormWidget
field_behaviours = json.dumps(ProvidersManager().field_behaviours)
File "/usr/local/lib/python3.9/site-packages/airflow/providers_manager.py", line 111, in __init__
_create_customized_form_field_behaviours_schema_validator()
File "/usr/local/lib/python3.9/site-packages/airflow/providers_manager.py", line 53, in _create_customized_form_field_behaviours_schema_validator
importlib_resources.read_text('airflow', 'customized_form_field_behaviours.schema.json')
File "/usr/local/Cellar/[email protected]/3.9.0_2/Frameworks/Python.framework/Versions/3.9/lib/python3.9/importlib/resources.py", line 139, in read_text
with open_text(package, resource, encoding, errors) as fp:
File "/usr/local/Cellar/[email protected]/3.9.0_2/Frameworks/Python.framework/Versions/3.9/lib/python3.9/importlib/resources.py", line 121, in open_text
open_binary(package, resource), encoding=encoding, errors=errors)
File "/usr/local/Cellar/[email protected]/3.9.0_2/Frameworks/Python.framework/Versions/3.9/lib/python3.9/importlib/resources.py", line 91, in open_binary
return reader.open_resource(resource)
File "<frozen importlib._bootstrap_external>", line 995, in open_resource
FileNotFoundError: [Errno 2] No such file or directory: '/usr/local/lib/python3.9/site-packages/airflow/customized_form_field_behaviours.schema.json'
```
| https://github.com/apache/airflow/issues/13027 | https://github.com/apache/airflow/pull/13031 | 15fd1bc890aa1630ef16e7981408f8f994d30d97 | baa68ca51f93b3cea18efc24a7540a0ddf89c03d | "2020-12-12T00:42:57Z" | python | "2020-12-12T09:21:43Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,969 | ["airflow/cli/cli_parser.py", "airflow/cli/commands/task_command.py", "airflow/executors/celery_executor.py", "airflow/executors/local_executor.py", "airflow/task/task_runner/standard_task_runner.py"] | S3 Remote Logging not working | <!--
-->
**Apache Airflow version**: v2.0.0b3
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): 1.16.15
**Environment**:
- **Cloud provider or hardware configuration**: AWS
- **OS** (e.g. from /etc/os-release):
- **Kernel** (e.g. `uname -a`):
- **Install tools**: Custom Helm Chart
- **Others**:
**What happened**:
S3 Remote Logging not working. Below is the stacktrace:
```
Running <TaskInstance: canary_dag.print_date 2020-12-09T19:46:17.200838+00:00 [queued]> on host canarydagprintdate-9fafada4409d4eafb5e6e9c7187810ae │
│ [2020-12-09 19:54:09,825] {s3_task_handler.py:183} ERROR - Could not verify previous log to append: 'NoneType' object is not callable │
│ Traceback (most recent call last): │
│ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/log/s3_task_handler.py", line 179, in s3_write │
│ if append and self.s3_log_exists(remote_log_location): │
│ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/log/s3_task_handler.py", line 141, in s3_log_exists │
│ return self.hook.check_for_key(remote_log_location) │
│ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 57, in wrapper │
│ connection = self.get_connection(self.aws_conn_id) │
│ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/hooks/base.py", line 63, in get_connection │
│ conn = Connection.get_connection_from_secrets(conn_id) │
│ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/connection.py", line 351, in get_connection_from_secrets │
│ conn = secrets_backend.get_connection(conn_id=conn_id) │
│ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 64, in wrapper │
│ with create_session() as session: │
│ File "/usr/local/lib/python3.7/contextlib.py", line 112, in __enter__ │
│ return next(self.gen) │
│ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 29, in create_session │
│ session = settings.Session() │
│ TypeError: 'NoneType' object is not callable │
│ [2020-12-09 19:54:09,826] {s3_task_handler.py:193} ERROR - Could not write logs to s3://my-favorite-airflow-logs/canary_dag/print_date/2020-12-09T19:46:17.200838+00:00/2.log │
│ Traceback (most recent call last): │
│ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/log/s3_task_handler.py", line 190, in s3_write │
│ encrypt=conf.getboolean('logging', 'ENCRYPT_S3_LOGS'), │
│ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/providers/amazon/aws/hooks/s3.py", line 57, in wrapper │
│ connection = self.get_connection(self.aws_conn_id) │
│ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/hooks/base.py", line 63, in get_connection │
│ conn = Connection.get_connection_from_secrets(conn_id) │
│ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/models/connection.py", line 351, in get_connection_from_secrets │
│ conn = secrets_backend.get_connection(conn_id=conn_id) │
│ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 64, in wrapper │
│ with create_session() as session: │
│ File "/usr/local/lib/python3.7/contextlib.py", line 112, in __enter__ │
│ return next(self.gen) │
│ File "/home/airflow/.local/lib/python3.7/site-packages/airflow/utils/session.py", line 29, in create_session │
│ session = settings.Session() │
│ TypeError: 'NoneType' object is not callable
stream closed
```
**What you expected to happen**
Able to see the task instance logs in the airflow UI being read from S3 remote location.
**How to reproduce it**:
Pulled the latest master and created an airflow image from the dockerfile mentioned in the repo.
| https://github.com/apache/airflow/issues/12969 | https://github.com/apache/airflow/pull/13057 | 6bf9acb90fcb510223cadc1f41431ea5f57f0ca1 | ab5f770bfcd8c690cbe4d0825896325aca0beeca | "2020-12-09T20:21:42Z" | python | "2020-12-14T16:28:01Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,912 | ["airflow/jobs/scheduler_job.py", "tests/jobs/test_scheduler_job.py"] | dagrun_timeout doesn't kill task instances on timeout | **Apache Airflow version**:
1.10.12
**What happened**:
I created a DAG with `dagrun_timeout` set to 2 minutes.
After 2 minutes the DAG run is marked as failed and the next one is started, but the task keeps running.
**What you expected to happen**:
The task is killed together with the DAG run, just as happens when you mark a DAG run failed manually.
**How to reproduce it**:
```
from datetime import datetime, timedelta

from airflow import DAG
from airflow.operators.bash_operator import BashOperator

dag = DAG(dag_id='platform.airflow-test',
description='',
schedule_interval="0 0 * * *",
start_date=datetime(2020, 7, 1),
max_active_runs=1,
catchup=True,
dagrun_timeout=timedelta(minutes=2))
run_this = BashOperator(
task_id='run_after_loop',
bash_command=' for((i=1;i<=600;i+=1)); do echo "Welcome $i times"; sleep 1; done',
dag=dag,
)
``` | https://github.com/apache/airflow/issues/12912 | https://github.com/apache/airflow/pull/14321 | 9f37af25ae7eb85fa8dbb70b7dbb23bbd5505323 | 09327ba6b371aa68cf681747c73a7a0f4968c173 | "2020-12-08T09:36:09Z" | python | "2021-03-05T00:45:06Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,909 | [".github/workflows/scheduled_quarantined.yml"] | Quarantined Build is broken | Seems like the script `./scripts/ci/tools/ci_check_if_tests_should_be_run.sh` has been removed from code between release 1.10.12 & 1.10.13, and since then the Quarantined Build is broken https://github.com/apache/airflow/actions/runs/405827008
cc - @potiuk
| https://github.com/apache/airflow/issues/12909 | https://github.com/apache/airflow/pull/13288 | c2bedd580c3dd0e971ac394be25e331ba9c1c932 | c4809885ecd7ec1a92a1d8d0264234d86479bf24 | "2020-12-08T05:29:46Z" | python | "2020-12-23T17:52:30Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,881 | ["Dockerfile", "Dockerfile.ci", "IMAGES.rst", "scripts/in_container/_in_container_utils.sh", "scripts/in_container/run_ci_tests.sh", "scripts/in_container/run_install_and_test_provider_packages.sh", "scripts/in_container/run_prepare_provider_readme.sh", "setup.py", "tests/providers/presto/hooks/test_presto.py"] | Snowflake python connector monkeypatches urllib and makes many services unusable. | Curreently wnen you run snowflke provider, it monkeypatches urlllb in a way that is not compatible with other libraries (for example presto SSL with kerberos, google, amazon, qubole and many others).
This is not critical (in 2.0 we have provider separation and the Snowflake code will not even be there until you choose the [snowflake] extra or install the provider manually).
For now we decided to release but immediately yank the Snowflake provider!
Additional links:
* Issue: https://github.com/snowflakedb/snowflake-connector-python/issues/324
Offending code:
* https://github.com/snowflakedb/snowflake-connector-python/blob/133d6215f7920d304c5f2d466bae38127c1b836d/src/snowflake/connector/network.py#L89-L92
| https://github.com/apache/airflow/issues/12881 | https://github.com/apache/airflow/pull/13654 | 821194beead51868ce360dfc096dbab91760cc37 | 6e90dfc38b1bf222f47acc2beb1a6c7ceccdc8dc | "2020-12-07T12:47:04Z" | python | "2021-01-16T11:52:56Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,877 | ["setup.cfg"] | ImportError: cannot import name '_Union' from 'typing' (/usr/lib/python3.9/typing.py) | **Apache Airflow version**: 1.10.3
**Environment**:
- **OS** (e.g. from /etc/os-release): Arch Linux
- **Kernel** (e.g. `uname -a`): Linux 5.9.11-arch2-1 #1 SMP PREEMPT Sat, 28 Nov 2020 02:07:22 +0000 x86_64 GNU/Linux
- **Install tools**: pip 2.3.1 (with _--use-deprecated legacy-resolver_)
- **Others**: python 3.9
**What happened**:
```
(env) ➜ project-airflow git:(feature-implementation) ✗ ./env/bin/airflow webserver
Traceback (most recent call last):
File "/home/user/dev/project-airflow/./env/bin/airflow", line 26, in <module>
from airflow.bin.cli import CLIFactory
File "/home/user/dev/project-airflow/env/lib/python3.9/site-packages/airflow/bin/cli.py", line 95, in <module>
api_module = import_module(conf.get('cli', 'api_client')) # type: Any
File "/usr/lib/python3.9/importlib/__init__.py", line 127, in import_module
return _bootstrap._gcd_import(name[level:], package, level)
File "/home/user/dev/project-airflow/env/lib/python3.9/site-packages/airflow/api/client/local_client.py", line 24, in <module>
from airflow.api.common.experimental import delete_dag
File "/home/user/dev/project-airflow/env/lib/python3.9/site-packages/airflow/api/common/experimental/delete_dag.py", line 26, in <module>
from airflow.models.serialized_dag import SerializedDagModel
File "/home/user/dev/project-airflow/env/lib/python3.9/site-packages/airflow/models/serialized_dag.py", line 35, in <module>
from airflow.serialization.serialized_objects import SerializedDAG
File "/home/user/dev/project-airflow/env/lib/python3.9/site-packages/airflow/serialization/serialized_objects.py", line 28, in <module>
import cattr
File "/home/user/dev/project-airflow/env/lib/python3.9/site-packages/cattr/__init__.py", line 2, in <module>
from .converters import Converter, UnstructureStrategy
File "/home/user/dev/project-airflow/env/lib/python3.9/site-packages/cattr/converters.py", line 15, in <module>
from ._compat import (
File "/home/user/dev/project-airflow/env/lib/python3.9/site-packages/cattr/_compat.py", line 87, in <module>
from typing import _Union
ImportError: cannot import name '_Union' from 'typing' (/usr/lib/python3.9/typing.py)
```
**How to reproduce it**:
Try to launch the airflow webserver with Python **3.9**
**Anything else we need to know**:
-- | https://github.com/apache/airflow/issues/12877 | https://github.com/apache/airflow/pull/13223 | f95b1c9c95c059e85ad5676daaa191929785fee2 | 9c0a5df22230105eb3a571c040daaba3f9cadf37 | "2020-12-07T10:19:45Z" | python | "2020-12-21T20:36:54Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,852 | ["IMAGES.rst", "README.md"] | The README file in this repo has a bad link - [404:NotFound] "production-deployment.html" | **Apache Airflow version**:
N/A
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
N/A
**Environment**:
N/A
- **Cloud provider or hardware configuration**: N/A
- **OS** (e.g. from /etc/os-release): N/A
- **Kernel** (e.g. `uname -a`): N/A
- **Install tools**: N/A
- **Others**: N/A
**What happened**:
The link under “Latest docs” gives
Status code [404:NotFound] - Link: https://github.com/apache/airflow/blob/master/docs/production-deployment.html
**What you expected to happen**:
The link should point to an actual file.
The closest name file I could find is “https://github.com/apache/airflow/blob/master/docs/apache-airflow/production-deployment.rst”
But I was not sure if this is what the link should be pointing to or not(??)
**How to reproduce it**:
Click the link in the main page for this repo
## Install minikube/kind
N/A
**Anything else we need to know**:
This bad link was found by a tool I recently created as part of an new experimental hobby project: https://github.com/MrCull/GitHub-Repo-ReadMe-Dead-Link-Finder
Re-check this Repo via: http://githubreadmechecker.com/Home/Search?SingleRepoUri=https%3a%2f%2fgithub.com%2fapache%2fairflow
Check all Repos for this GitHub account: http://githubreadmechecker.com/Home/Search?User=apache
--
I (a human) verified that this link is broken and have manually logged this Issue (i.e. this Issue has not been created by a bot).
If this has been in any way helpful then please consider giving the above Repo a Star.
If you have any feedback on the information provided here, or on the tool itself, then please feel free to share your thoughts and pass on the feedback, or long an “Issue”.
| https://github.com/apache/airflow/issues/12852 | https://github.com/apache/airflow/pull/12854 | a00f25011fc6c859b27b6c78b9201880cf6323ce | 3663d1519eb867b6bb152b27b93033666993511a | "2020-12-06T13:53:21Z" | python | "2020-12-07T00:05:21Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,832 | ["dev/README_RELEASE_AIRFLOW.md", "dev/sign.sh"] | Source hash apache-airflow-1.10.13-bin.tar.gz.sha512 format is invalid | **Description**
The SHA-512 checksum file for apache-airflow releases is in an unexpected format for Python-based checksum modules.
**Current file format:**
apache-airflow-1.10.13rc1-bin.tar.gz: 36D641C0 F2AAEC4E BCE91BD2 66CE2BC6
AA2D995C 08C9B62A 0EA1CBEC 027E657B
8AF4B54E 6C3AD117 9634198D F6EA53F8
163711BA 95586B5B 7BCF7F4B 098A19E2
**Wanted formats**
xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx amd64\apache-airflow-1.10.13-bin.tar.gz
**Or**
`xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx`
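For illustration, a small Python sketch that produces the `<hash>  <filename>` layout shown above (the same layout `sha512sum` emits and that Ansible/Salt can consume):

```python
import hashlib

def sha512_line(path: str) -> str:
    digest = hashlib.sha512()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    # "<hash>  <filename>" is the layout most checksum consumers accept
    return f"{digest.hexdigest()}  {path}"

print(sha512_line("apache-airflow-1.10.13-bin.tar.gz"))
```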
**Use case / motivation**
Ansible and salt python libraries to consume checksums do not understand the format...
```
ID: airflow-archive-install
Function: archive.extracted
Name: /opt/apache-airflow-1.10.13/bin/
Result: False
Comment: Attempt 1: Returned a result of "False", with the following comment: "Source hash https://github.com/apache/airflow/releases/download/1.10.13/ap
ache-airflow-1.10.13-bin.tar.gz.sha512 format is invalid. The supported formats are: 1) a hash, 2) an expression in the format <hash_type>=<hash>, or 3) eithe
r a path to a local file containing hashes, or a URI of a remote hash file. Supported protocols for remote hash files are: salt, file, http, https, ftp, swift
, s3. The hash may also not be of a valid length, the following are supported hash types and lengths: md5 (32), sha1 (40), sha224 (56), sha256 (64), sha384 (9
6), sha512 (128)."
......etc
Started: 11:39:44.082079
Duration: 123506.098 ms
```
**Related Issues**
No
| https://github.com/apache/airflow/issues/12832 | https://github.com/apache/airflow/pull/12867 | 298c88a434325dd6df8f374057709022e0b0811f | a00f25011fc6c859b27b6c78b9201880cf6323ce | "2020-12-05T12:01:35Z" | python | "2020-12-06T23:46:06Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,827 | ["airflow/config_templates/default_webserver_config.py", "docs/apache-airflow/security/webserver.rst"] | Missing docs about webserver_config.py | Hello,
We are missing documentation on the `webserver_config.py` file. I think it is worth answering the following questions in this guide:
* What is this file?
* What is this file for?
* When and how should you edit this file?
Best regards,
Kamil Breguła | https://github.com/apache/airflow/issues/12827 | https://github.com/apache/airflow/pull/13155 | 23a47879ababe76f6cf9034a2bae055b2a91bf1f | 81fed8072d1462ab43818bb7757ade4b67982976 | "2020-12-05T05:45:30Z" | python | "2020-12-20T01:21:49Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,807 | ["airflow/config_templates/config.yml", "airflow/config_templates/default_airflow.cfg", "airflow/configuration.py", "airflow/models/baseoperator.py", "tests/core/test_configuration.py", "tests/models/test_baseoperator.py"] | add default weight_rule to airflow.cfg | **Description**
It would be nice if the weight_rule default value could be managed by a global config
suggested config:
```
# Weighting method used for the effective total priority weight of the task.
# Options are: { downstream | upstream | absolute }; default is downstream
default_weight_rule = downstream
```
**Use case / motivation**
In some pipelines you really need absolute weights, and then you have to add a line to each task definition, which is annoying (see the sketch below).
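For illustration, the per-task argument that has to be repeated today (a sketch; a global config default would remove the need for it):

```python
from airflow import DAG
from airflow.operators.bash import BashOperator
from airflow.utils.dates import days_ago

with DAG("weight_rule_example", start_date=days_ago(1), schedule_interval=None) as dag:
    # Without a global default, this argument has to be added to every task:
    run_this = BashOperator(
        task_id="example_task",
        bash_command="echo hello",
        weight_rule="absolute",
    )
```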
| https://github.com/apache/airflow/issues/12807 | https://github.com/apache/airflow/pull/18627 | d0ffd31ba3a4e8cd27fb7305cc19c33cf637509f | d79f506213297dc0dc034d6df3226361b6f95d7a | "2020-12-04T10:02:56Z" | python | "2021-09-30T14:53:55Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,796 | ["airflow/providers/http/sensors/http.py"] | Make headers templated in HttpSensor | **Description**
Make HttpSensor `headers` parameter templated.
**Use case / motivation**
This would allow for passing data from other tasks, such as an API token, in the headers.
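For example, with templated `headers` something like this sketch would become possible (it assumes an upstream `get_token` task that pushes an API token to XCom, and the template only renders once `headers` is a templated field):

```python
from airflow.providers.http.sensors.http import HttpSensor

wait_for_api = HttpSensor(
    task_id="wait_for_api",
    http_conn_id="my_api",   # placeholder connection id
    endpoint="status",
    headers={"Authorization": "Bearer {{ ti.xcom_pull(task_ids='get_token') }}"},
    poke_interval=60,
)
```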
**Related Issues**
N/A | https://github.com/apache/airflow/issues/12796 | https://github.com/apache/airflow/pull/12809 | 37afe55775676e2cb4cf6ed0cfc6c892855d6805 | c1cd50465c5473bc817fded5eeb4c425a0529ae5 | "2020-12-03T20:57:43Z" | python | "2020-12-05T00:59:52Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,783 | ["airflow/models/baseoperator.py", "airflow/sensors/base_sensor_operator.py", "airflow/serialization/schema.json", "airflow/serialization/serialized_objects.py", "tests/serialization/test_dag_serialization.py"] | Sensors in reschedule mode are not rescheduled | **Apache Airflow version**:
2.0.0dev
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
**Environment**:
```
./breeze --python=3.8 --backend=postgres --db-reset restart
```
**What happened**:
Sensors in reschedule mode are not rescheduled by scheduler.
**What you expected to happen**:
Sensors in both poke and reschedule mode should work.
**How to reproduce it**:
```
from airflow import DAG
from airflow.sensors.base_sensor_operator import BaseSensorOperator
from airflow.utils.dates import days_ago
class DummySensor(BaseSensorOperator):
def poke(self, context):
return False
with DAG(
"other_dag",
start_date=days_ago(1),
schedule_interval="*/5 * * * *",
catchup=False
) as dag3:
DummySensor(
task_id='wait-task',
poke_interval=60 * 5,
mode='reschedule'
)
```
Then:
```
root@053f6ca34e24: /opt/airflow# airflow dags unpause other_dag
Dag: other_dag, paused: False
root@053f6ca34e24: /opt/airflow# airflow scheduler
____________ _____________
____ |__( )_________ __/__ /________ __
____ /| |_ /__ ___/_ /_ __ /_ __ \_ | /| / /
___ ___ | / _ / _ __/ _ / / /_/ /_ |/ |/ /
_/_/ |_/_/ /_/ /_/ /_/ \____/____/|__/
[2020-12-03 14:18:58,404] {scheduler_job.py:1247} INFO - Starting the scheduler
[2020-12-03 14:18:58,404] {scheduler_job.py:1252} INFO - Processing each file at most -1 times
[2020-12-03 14:18:58,571] {dag_processing.py:250} INFO - Launched DagFileProcessorManager with pid: 63835
[2020-12-03 14:18:58,576] {scheduler_job.py:1757} INFO - Resetting orphaned tasks for active dag runs
[2020-12-03 14:18:58,660] {settings.py:52} INFO - Configured default timezone Timezone('UTC')
[2020-12-03 14:18:58,916] {scheduler_job.py:944} INFO - 1 tasks up for execution:
<TaskInstance: other_dag.wait-task 2020-12-03 14:10:00+00:00 [scheduled]>
[2020-12-03 14:18:58,920] {scheduler_job.py:973} INFO - Figuring out tasks to run in Pool(name=default_pool) with 128 open slots and 1 task instances ready to be queued
[2020-12-03 14:18:58,921] {scheduler_job.py:1001} INFO - DAG other_dag has 0/16 running and queued tasks
[2020-12-03 14:18:58,921] {scheduler_job.py:1066} INFO - Setting the following tasks to queued state:
<TaskInstance: other_dag.wait-task 2020-12-03 14:10:00+00:00 [scheduled]>
[2020-12-03 14:18:58,925] {scheduler_job.py:1108} INFO - Sending TaskInstanceKey(dag_id='other_dag', task_id='wait-task', execution_date=datetime.datetime(2020, 12, 3, 14, 10, tzinfo=Timezone('UTC')), try_number=1) to executor with priority 1 and queue default
[2020-12-03 14:18:58,926] {base_executor.py:79} INFO - Adding to queue: ['airflow', 'tasks', 'run', 'other_dag', 'wait-task', '2020-12-03T14:10:00+00:00', '--local', '--pool', 'default_pool', '--subdir', '/files/dags/the_old_issue.py']
[2020-12-03 14:18:58,935] {local_executor.py:80} INFO - QueuedLocalWorker running ['airflow', 'tasks', 'run', 'other_dag', 'wait-task', '2020-12-03T14:10:00+00:00', '--local', '--pool', 'default_pool', '--subdir', '/files/dags/the_old_issue.py']
[2020-12-03 14:18:59,063] {dagbag.py:440} INFO - Filling up the DagBag from /files/dags/the_old_issue.py
Running <TaskInstance: other_dag.wait-task 2020-12-03T14:10:00+00:00 [queued]> on host 053f6ca34e24
[2020-12-03 14:19:00,022] {scheduler_job.py:944} INFO - 1 tasks up for execution:
<TaskInstance: other_dag.wait-task 2020-12-03 14:10:00+00:00 [scheduled]>
[2020-12-03 14:19:00,029] {scheduler_job.py:973} INFO - Figuring out tasks to run in Pool(name=default_pool) with 128 open slots and 1 task instances ready to be queued
[2020-12-03 14:19:00,029] {scheduler_job.py:1001} INFO - DAG other_dag has 0/16 running and queued tasks
[2020-12-03 14:19:00,029] {scheduler_job.py:1066} INFO - Setting the following tasks to queued state:
<TaskInstance: other_dag.wait-task 2020-12-03 14:10:00+00:00 [scheduled]>
[2020-12-03 14:19:00,033] {scheduler_job.py:1108} INFO - Sending TaskInstanceKey(dag_id='other_dag', task_id='wait-task', execution_date=datetime.datetime(2020, 12, 3, 14, 10, tzinfo=Timezone('UTC')), try_number=1) to executor with priority 1 and queue default
[2020-12-03 14:19:00,033] {base_executor.py:82} ERROR - could not queue task TaskInstanceKey(dag_id='other_dag', task_id='wait-task', execution_date=datetime.datetime(2020, 12, 3, 14, 10, tzinfo=Timezone('UTC')), try_number=1)
[2020-12-03 14:19:00,038] {scheduler_job.py:1199} INFO - Executor reports execution of other_dag.wait-task execution_date=2020-12-03 14:10:00+00:00 exited with status success for try_number 1
[2020-12-03 14:19:00,045] {scheduler_job.py:1235} ERROR - Executor reports task instance <TaskInstance: other_dag.wait-task 2020-12-03 14:10:00+00:00 [queued]> finished (success) although the task says its queued. (Info: None) Was the task killed externally?
[2020-12-03 14:19:01,173] {dagrun.py:429} ERROR - Marking run <DagRun other_dag @ 2020-12-03 14:10:00+00:00: scheduled__2020-12-03T14:10:00+00:00, externally triggered: False> failed
```
**Anything else we need to know**:
Discovered when working on #10790
Thanks to @nathadfield for helping discover this issue!
| https://github.com/apache/airflow/issues/12783 | https://github.com/apache/airflow/pull/12858 | 75d8ff96b4e7736b177c3bb8e949653d6a501736 | c045ff335eecb5c72aeab9e7f01973c18f678ff7 | "2020-12-03T13:52:28Z" | python | "2020-12-06T21:55:53Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,780 | ["PULL_REQUEST_WORKFLOW.rst", "scripts/ci/selective_ci_checks.sh"] | K8S were not run on cli change | In https://github.com/apache/airflow/pull/12725 selective checks did not run K8S tests. | https://github.com/apache/airflow/issues/12780 | https://github.com/apache/airflow/pull/13305 | e9d65bd4582b083914f2fc1213bea44cf41d1a08 | e2bfac9fc874a6dd1eb52a067313f43ec94307e3 | "2020-12-03T09:36:39Z" | python | "2020-12-24T14:48:57Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,776 | ["airflow/migrations/versions/4addfa1236f1_add_fractional_seconds_to_mysql_tables.py", "airflow/migrations/versions/d2ae31099d61_increase_text_size_for_mysql.py", "airflow/migrations/versions/e959f08ac86c_change_field_in_dagcode_to_mediumtext_.py", "airflow/models/dagcode.py"] | Update source_code field of dag_code table to MEDIUMTEXT | <!--
-->
**Description**
Update source_code field of dag_code table to MEDIUMTEXT
**Use case / motivation**
Lots of DAGs exceed the 65K character limit, giving the error `"Data too long for column 'source_code' at row 1"` when enabling the webserver to fetch dag_code from the DB.
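A rough sketch of the kind of Alembic migration this implies (revision boilerplate and guards for non-MySQL backends omitted; this is an assumption, not the final migration):

```python
import sqlalchemy as sa
from alembic import op
from sqlalchemy.dialects import mysql

def upgrade():
    # Widen dag_code.source_code from TEXT (~64KB) to MEDIUMTEXT (~16MB) on MySQL.
    op.alter_column(
        "dag_code",
        "source_code",
        existing_type=sa.Text(),
        type_=mysql.MEDIUMTEXT(),
    )
```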
| https://github.com/apache/airflow/issues/12776 | https://github.com/apache/airflow/pull/12890 | b11551278a703e2e742969ac554908f16f235809 | f66a46db88da86b4a11c5ee142c09a5001c32c41 | "2020-12-03T08:42:01Z" | python | "2020-12-07T22:17:22Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,773 | ["airflow/config_templates/config.yml", "docs/apache-airflow/configurations-ref.rst"] | Incomplete list of environment variables that override configuration | Hello,
In our configuration reference docs, we provide information about the environment variables that affect the options.
<img width="430" alt="Screenshot 2020-12-03 at 09 14 33" src="https://user-images.githubusercontent.com/12058428/100982181-07389c00-3548-11eb-9089-fe00c4b9367f.png">
Unfortunately, this list is not complete. Some configuration options can also be set using `AIRFLOW__{SECTION}__{OPTION}__SECRET` or `AIRFLOW__{SECTION}__{OPTION}__CMD` env variable. See: http://apache-airflow-docs.s3-website.eu-central-1.amazonaws.com/docs/apache-airflow/latest/howto/set-config.html
| https://github.com/apache/airflow/issues/12773 | https://github.com/apache/airflow/pull/12820 | e82cf0d01d6c1e1ec65d8e1b70d65158947fccd2 | c85f49454de63f5857bf477a240229a71f0e78ff | "2020-12-03T08:17:58Z" | python | "2020-12-05T06:00:18Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,769 | ["docs/apache-airflow/upgrading-to-2.rst"] | Documentation needed for DB upgrade as part of 2.0 | Following up on the dev call on 30th of November, there was a clear desire expressed for documentation around the database upgrade process from Airflow 1.10.14 (or equivalent) to Airflow 2.0. Though the upgrade process is conceptually no different from a regular 1.10.x to a 1.10.x+1 release, the fact that there are significant known database changes may raise concerns in the minds of Airflow users as part of the upgrade.
To ease their concerns, the following questions should ideally be answered as part of the documentation specifically either as part of the "Upgrading to 2.0 document" or linked from there.
Q 1. Is there anything "special" which I need to be done to upgrade from 1.10.x to 2.0 with respect to the database?
Ans. I don't believe so, other than the normal upgrade checks.
Q 2. How long should I expect this database upgrade expected to take?
Ans. I am not quite sure how to answer this since it depends on the data. We can possibly share sample times based on tested data sets.
Q 3. Can I do something to reduce the database upgrade time?
Ans. A couple of options here. One possibility is to recommend the maintenance DAGs to be run to archive / delete older task history, xcom data, and equivalent. Another possibility is to provide a script for them to run as part of the Airflow project distribution, possibly part of upgrade check scripts.
| https://github.com/apache/airflow/issues/12769 | https://github.com/apache/airflow/pull/13005 | 3fbc8e650dcd398bc2844b7b3d92748423c7611a | 0ffd5fa3d87e78126807e6cdb4b1b29154047153 | "2020-12-03T01:57:37Z" | python | "2020-12-11T16:20:35Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,757 | ["airflow/models/baseoperator.py", "tests/utils/test_task_group.py"] | Graph View is empty when Operator has multiline string in args (v2.0) | Airflow v2.0b3
Kubernetes v1.19.3
Discovered the issue while testing KubernetesPodOperator (haven't tested with other operators).
If I create a multiline string using `""" """` (triple quotes), add some variables inside (Jinja templating), and then use this string as an argument to KubernetesPodOperator:
- in Graph View the DAG is not visible (just a gray area where the digraph should be);
- in the browser's web console I see the following error:
`Uncaught TypeError: node is undefined
preProcessGraph http://localhost:8080/static/dist/dagre-d3.min.js:103
preProcessGraph http://localhost:8080/static/dist/dagre-d3.min.js:103
fn http://localhost:8080/static/dist/dagre-d3.min.js:103
call http://localhost:8080/static/dist/d3.min.js:3
draw http://localhost:8080/graph?dag_id=mydag&execution_date=mydate
expand_group http://localhost:8080/graph?dag_id=mydag&execution_date=mydate
<anonymous> http://localhost:8080/graph?dag_id=mydag&execution_date=mydate`
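A minimal reproduction sketch of what I mean (namespace and image are placeholders):

```python
from airflow import DAG
from airflow.providers.cncf.kubernetes.operators.kubernetes_pod import KubernetesPodOperator
from airflow.utils.dates import days_ago

multiline_script = """
echo "run date: {{ ds }}"
echo "dag: {{ dag.dag_id }}"
"""

with DAG("multiline_args_repro", start_date=days_ago(1), schedule_interval=None) as dag:
    KubernetesPodOperator(
        task_id="multiline_args",
        name="multiline-args",
        namespace="default",   # placeholder
        image="alpine:3.12",   # placeholder
        cmds=["sh", "-c"],
        arguments=[multiline_script],  # the multiline templated string is what breaks Graph View
    )
```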
Tree view works without issues in this case. The DAG succeeds. | https://github.com/apache/airflow/issues/12757 | https://github.com/apache/airflow/pull/12829 | cd66450b4ee2a219ddc847970255e420ed679700 | 12ce5be77f64c335dce12c3586d2dc7b63491d34 | "2020-12-02T14:59:06Z" | python | "2020-12-05T11:52:55Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,751 | ["chart/templates/flower/flower-service.yaml", "chart/templates/webserver/webserver-service.yaml", "chart/tests/test_flower.py", "chart/tests/test_webserver.py", "chart/values.schema.json", "chart/values.yaml"] | Helm Chart: Provide option to specify loadBalancerIP in webserver service | <!--
-->
**Description**
The current service type for `webserver` is defaulted at `ClusterIP`.
I am able to change it to the `LoadBalancer` type, but I was not able to specify a static IP.
So every time we reinstall the chart, it will change the assigned IP of the loadbalancer being provisioned to us.
| https://github.com/apache/airflow/issues/12751 | https://github.com/apache/airflow/pull/15972 | bb43e06c75dd6cafc094813347f7a7b13cb9374e | 9875f640ca19dabd846c17f4278ccc90e189ae8d | "2020-12-02T04:19:48Z" | python | "2021-05-21T23:06:09Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,748 | ["codecov.yml"] | Code Coverage is Broken | https://codecov.io/github/apache/airflow?branch=master CodeCov code-coverage is broken on Master. It wasn't great but still useful to check which sections needed lacks tests.
cc @potiuk | https://github.com/apache/airflow/issues/12748 | https://github.com/apache/airflow/pull/13092 | 0eb210df3e10b478a567291355bc269150c93ae5 | ae98c074032861b07d6945a8f6f493b319dcc374 | "2020-12-01T23:46:36Z" | python | "2020-12-15T21:33:39Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,744 | ["setup.cfg", "setup.py"] | Difference of extras Airflow 2.0 vs. Airflow 1.10 | **Description**
When Airflow 2.0 is installed from PyPI, providers are not installed by default. In order to install them, you should add an appropriate extra. While this behavior is identical to Airflow 1.10 for those "providers" that required additional packages, there were a few "providers" that did not require any extras to function (for example http, ftp). We have "http" and "ftp" extras for them now, but maybe some of those are popular enough to be included by default?
We have to make a decision now:
- [x] should all of them (or some of them) be included by default when you install Airflow?
- [x] if we decide to exclude only some (or none), we should add them in UPGRADING_to_2_0 and in UPDATING documentation.
**Use case / motivation**
We want people to get a familiar experience when installing Airflow. While we provide a familiar mechanism (extras), people will run into slightly different configuration and installation behaviour, and we can describe the differences, but maybe some of those providers are so popular that we should include them by default?
**Related Issues**
#12685 - where we discuss which of the extras should be included in the Production Image of 2.0.
**Additional info**
Here is the list of all "providers" that were present in 1.10 and had no additional dependencies - so basically they woudl work out-fhe-box in 1.10, but they need appropriate "extra" in 2.0.
* "apache.pig": [],
* "apache.sqoop": [],
* "dingding": [],
* "discord": [],
* "ftp": [],
* "http": [],
* "imap": [],
* "openfaas": [],
* "opsgenie": [],
* "sqlite": [],
Also here I appeal to the wisdom of crowd: @ashb, @dimberman @kaxil, @turbaszek, @mik-laj. @XD-DENG, @feluelle, @eladkal, @ryw, @vikramkoka, @KevinYang21 - let me know WDYT before I bring it to devlist? | https://github.com/apache/airflow/issues/12744 | https://github.com/apache/airflow/pull/12916 | 9b39f24780e85f859236672e9060b2fbeee81b36 | e7c1771cba16e3f554d6de5f77c97e49b16f7bed | "2020-12-01T18:44:37Z" | python | "2020-12-08T15:22:47Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,726 | ["docs/apache-airflow/tutorial_taskflow_api.rst"] | Add classic operator in TaskFlow API tutorial |
**Description**
The TaskFlow API tutorial should add an example that uses a classic operator (for example EmailOperator) so that users know it can be leveraged.
Alternatively, it should add references to how to add dependencies (implicit or explicit) to classic operators.
**Use case / motivation**
It's not super clear how the TaskFlow API can be used with existing operators (e.g. PostgresOperator, EmailOperator...). Adding an example will help users get a picture of what can be done with this (see the sketch below).
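For instance, a sketch along these lines could be added to the tutorial (illustrative only; the exact wiring is up to the docs author):

```python
from airflow.decorators import dag, task
from airflow.operators.email import EmailOperator
from airflow.utils.dates import days_ago

@dag(schedule_interval=None, start_date=days_ago(1))
def taskflow_with_classic_operator():
    @task
    def extract():
        return 42

    rows_loaded = extract()

    send_report = EmailOperator(
        task_id="send_report",
        to="[email protected]",                 # placeholder
        subject="Load finished",
        html_content="Rows loaded: {{ ti.xcom_pull(task_ids='extract') }}",
    )

    # Explicit dependency between a TaskFlow task and a classic operator
    rows_loaded >> send_report

example_dag = taskflow_with_classic_operator()
```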
| https://github.com/apache/airflow/issues/12726 | https://github.com/apache/airflow/pull/19214 | 2fdcb8a89cd1aaf1a90657385a257e58926c21a9 | 2dfe85dcb4923f1c4cce8b1570561f11cf07c186 | "2020-12-01T00:27:48Z" | python | "2021-10-29T16:44:50Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,691 | ["airflow/www/templates/airflow/dag_details.html"] | add dagrun_timeout to the DAG Details screen in the UI | In the Details page the is no indication of the DAG `dagrun_timeout` | https://github.com/apache/airflow/issues/12691 | https://github.com/apache/airflow/pull/14165 | 92f81da91cc337e18e5aa77d445d0a8ab7d32600 | 61b613359e2394869070b3ad94f64dfda3efac74 | "2020-11-28T18:02:57Z" | python | "2021-02-10T20:25:04Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,666 | ["airflow/migrations/versions/33ae817a1ff4_add_kubernetes_resource_checkpointing.py", "airflow/migrations/versions/bef4f3d11e8b_drop_kuberesourceversion_and_.py"] | Error: table kube_resource_version already exists when calling reset_db() with SQLite backend | Issue encountered during alembic migration when running Airflow 2.0.0b3 locally using a sqlite backend and calling airflow.utils.db.resetdb():
**sqlalchemy.exc.OperationalError: (sqlite3.OperationalError) table kube_resource_version already exists
[SQL:
CREATE TABLE kube_resource_version (
one_row_id BOOLEAN DEFAULT (1) NOT NULL,
resource_version VARCHAR(255),
PRIMARY KEY (one_row_id),
CONSTRAINT kube_resource_version_one_row_id CHECK (one_row_id),
CHECK (one_row_id IN (0, 1))
)
]**
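Minimal reproduction (with `sql_alchemy_conn` pointing at a local sqlite file):

```python
from airflow.utils import db

db.resetdb()  # raises the "table kube_resource_version already exists" error above
```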
| https://github.com/apache/airflow/issues/12666 | https://github.com/apache/airflow/pull/12670 | fa8af2d16551e287673d94a40cfb41e49d685412 | 704e724cc127c9ca6c9f0f51997c9d057b697aec | "2020-11-27T19:22:55Z" | python | "2020-11-27T22:49:31Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,665 | ["scripts/in_container/run_generate_constraints.sh"] | Constraints behaviour changes in new PIP | We have this warning when running the latest PIP, so we have to take a close look what it means to us:
```
pip install 'https://github.com/apache/airflow/archive/master.tar.gz#egg=apache-airflow[devel_ci]'
--constraint https://raw.githubusercontent.com/apache/airflow/constraints-master/constraints-3.6.txt
DEPRECATION: Constraints are only allowed to take the form of a package
name and a version specifier. Other forms were originally permitted
as an accident of the implementation, but were undocumented. The new implementation
of the resolver no longer supports these forms. A possible replacement is replacing
the constraint with a requirement.. You can find discussion regarding
this at https://github.com/pypa/pip/issues/8210.
```
| https://github.com/apache/airflow/issues/12665 | https://github.com/apache/airflow/pull/12671 | 944bd4c658e9793c43c068e5359f816ded4f0b40 | 3b138d2d60d86ca0a80e9c27afd3421f45df178e | "2020-11-27T19:18:14Z" | python | "2020-11-28T05:04:45Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,632 | ["chart/templates/webserver/webserver-deployment.yaml", "chart/tests/test_webserver_deployment.py"] | helm chart: webserver replies with 404 "Apache Airflow is not at this location" when using config.webserver.base_url | I tried to deploy with helm values:
ingress.web.path: /airflow/*
config.webserver.base_url: https://myhostname/airflow
It doesn't work because then
* the livenessProbe uses `/health` instead of `/airflow/health`
* I don't think the livenessProbe sends the appropriate `Host` header, so even if it requested `/airflow/health` it will return 404 because airflow webserver thinks the requested url is `http://localhost:8080/airflow/health` instead of `http://myhostname/airflow/health`
If I open a shell to the running pod for the `webserver` container with
kubectl -n airflow-test exec -it airflow-test-webserver-569f8bb5f7-gw9rj -c webserver -- /bin/bash
and perform the query with
curl -v --header "Host: myhostname" --header "X-Forwarded-Host: myhostnamme" http://localhost:8080/airflow/login/ # this works
**Apache Airflow version**: 1.10.12
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): EKS 1.18
| https://github.com/apache/airflow/issues/12632 | https://github.com/apache/airflow/pull/12634 | 8ecdef3e50d3b83901d70a13794ae6afabc4964e | 75fd5f8254a6ecf616475a485f6da76240a34776 | "2020-11-25T22:08:17Z" | python | "2021-01-12T12:03:39Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,616 | ["airflow/api_connexion/schemas/dag_run_schema.py", "tests/api_connexion/schemas/test_dag_run_schema.py"] | Triggering a DAGRUN with invalid execution_date causes a ParserError in the REST API | Triggering a dagrun with an invalid execution date causes a ParserError in the REST API
**Apache Airflow version**: 2.0
**What happened**:
When you trigger a dagrun with an invalid execution date, it returns an HTML response showing the error
**What you expected to happen**:
I expected a JSON response showing that the execution date is invalid
**How to reproduce it**:
1. Start airflow webserver and scheduler
2. Make a post request to this endpoint `http://localhost:28080/api/v1/dags/example_bash_operator/dagRuns` with this request body `{"execution_date": "mydate"}`
3. This will return an HTML page instead of JSON
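A sketch of the request in step 2 (it assumes the basic-auth API backend is enabled locally with default credentials; adjust as needed):

```python
import requests

resp = requests.post(
    "http://localhost:28080/api/v1/dags/example_bash_operator/dagRuns",
    json={"execution_date": "mydate"},
    auth=("admin", "admin"),  # assumption: basic auth enabled locally
)
print(resp.status_code, resp.headers.get("Content-Type"))  # HTML error page, not JSON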
| https://github.com/apache/airflow/issues/12616 | https://github.com/apache/airflow/pull/12618 | 56f82ba22519b0cf2cb0a1f7c4d083db7f2e3358 | b62abfbfae5ae84f62522ce5db5852598caf9eb8 | "2020-11-25T12:59:16Z" | python | "2020-12-03T16:04:14Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,588 | ["airflow/migrations/versions/61ec73d9401f_add_description_field_to_connection.py", "airflow/migrations/versions/64a7d6477aae_fix_description_field_in_connection_to_.py", "airflow/models/connection.py"] | Connection description migration breaks on MySQL 8 | The migration added in #10873 doesn't work on Mysql8 -- it is too long for a text column with utf8 collation.
```
Running upgrade 2c6edca13270 -> 61ec73d9401f, Add description field to connection
Traceback (most recent call last):
File "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/base.py", line 1277, in _execute_context
cursor, statement, parameters, context
File "/usr/local/lib/python3.6/site-packages/sqlalchemy/engine/default.py", line 593, in do_execute
cursor.execute(statement, parameters)
File "/usr/local/lib/python3.6/site-packages/MySQLdb/cursors.py", line 255, in execute
self.errorhandler(self, exc, value)
File "/usr/local/lib/python3.6/site-packages/MySQLdb/connections.py", line 50, in defaulterrorhandler
raise errorvalue
File "/usr/local/lib/python3.6/site-packages/MySQLdb/cursors.py", line 252, in execute
res = self._query(query)
File "/usr/local/lib/python3.6/site-packages/MySQLdb/cursors.py", line 378, in _query
db.query(q)
File "/usr/local/lib/python3.6/site-packages/MySQLdb/connections.py", line 280, in query
_mysql.connection.query(self, query)
_mysql_exceptions.OperationalError: (1118, 'Row size too large. The maximum row size for the used table type, not counting BLOBs, is 65535. This includes storage overhead, check the manual. You have to change some columns to TEXT or BLOBs')
```
| https://github.com/apache/airflow/issues/12588 | https://github.com/apache/airflow/pull/12596 | 950d80bd98aef63905db9b01c7b8658d06c6f858 | cdaaff12c7c80311eba22dcb856fe9c24d7f49aa | "2020-11-24T14:30:03Z" | python | "2020-11-25T13:30:31Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,554 | ["docs/index.rst"] | Remove or limit table of content at the main Airflow doc page | Table of content here : http://apache-airflow-docs.s3-website.eu-central-1.amazonaws.com/docs/apache-airflow/latest/index.html at the main page of "apache-airflow" documentation is huge and useless (especially that we have it in directory on the left).
We should remove it or limit heavily (to 1st level only)

| https://github.com/apache/airflow/issues/12554 | https://github.com/apache/airflow/pull/12561 | b57b9321133a28126e17d17885c80dc04a2e121e | 936566c586e6cbb155ffa541e89a31f7239f51bb | "2020-11-22T19:34:52Z" | python | "2020-11-24T12:01:03Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,550 | ["airflow/www/templates/airflow/trigger.html", "airflow/www/views.py", "tests/www/test_views.py"] | Ability to provide sample conf JSON for a dag in trigger page | **Description**
In the trigger page, there is a text area to enter (optional) conf json. It would be great if a sample JSON can be provided programatically while defining DAG
**Use case / motivation**
This will improve usability of the UI that triggers a DAG
| https://github.com/apache/airflow/issues/12550 | https://github.com/apache/airflow/pull/13365 | b2cb6ee5ba895983e4e9d9327ff62a9262b765a2 | 0e510b2f2bb03ed9344df664b123920e70382fd1 | "2020-11-22T16:40:47Z" | python | "2021-01-07T02:33:23Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,537 | ["airflow/providers/docker/CHANGELOG.rst", "airflow/providers/docker/example_dags/example_docker_copy_data.py", "airflow/providers/docker/operators/docker.py", "airflow/providers/docker/operators/docker_swarm.py", "airflow/providers/docker/provider.yaml", "docs/conf.py", "docs/exts/docs_build/third_party_inventories.py", "tests/providers/docker/operators/test_docker.py", "tests/providers/docker/operators/test_docker_swarm.py"] | Mounting directories using docker operator on airflow is not working | <!--
Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.
Don't worry if they're not all applicable; just try to include what you can :-)
If you need to include code snippets or logs, please put them in fenced code
blocks. If they're super-long, please use the details tag like
<details><summary>super-long log</summary> lots of stuff </details>
Please delete these comment blocks before submitting the issue.
-->
<!--
IMPORTANT!!!
PLEASE CHECK "SIMILAR TO X EXISTING ISSUES" OPTION IF VISIBLE
NEXT TO "SUBMIT NEW ISSUE" BUTTON!!!
PLEASE CHECK IF THIS ISSUE HAS BEEN REPORTED PREVIOUSLY USING SEARCH!!!
Please complete the next sections or the issue will be closed.
These questions are the first thing we need to know to understand the context.
-->
**Apache Airflow version**: apache-airflow==1.10.12
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): Does not apply
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release): Ubuntu 18.04.5 LTS bionic
- **Kernel** (e.g. `uname -a`): Linux letyndr-letyndr 4.15.0-123-generic #126-Ubuntu SMP Wed Oct 21 09:40:11 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux
- **Install tools**:
- **Others**:
**What happened**:
I'm trying to use the docker operator to automate the execution of some scripts using airflow.
What I want to do is to "copy" all my project's files (with folders and files) to the container using this code.
The following file ml-intermediate.py is in this directory ~/airflow/dags/ml-intermediate.py:
```
"""
Template to convert a Ploomber DAG to Airflow
"""
from airflow import DAG
from airflow.operators.bash_operator import BashOperator
from airflow.utils.dates import days_ago
from ploomber.spec import DAGSpec
from soopervisor.script.ScriptConfig import ScriptConfig
script_cfg = ScriptConfig.from_path('/home/letyndr/airflow/dags/ml-intermediate')
# Replace the project root to reflect the new location - or maybe just
# write a soopervisor.yaml, then we can we rid of this line
script_cfg.paths.project = '/home/letyndr/airflow/dags/ml-intermediate'
# TODO: use lazy_import from script_cfg
dag_ploomber = DAGSpec('/home/letyndr/airflow/dags/ml-intermediate/pipeline.yaml',
lazy_import=True).to_dag()
dag_ploomber.name = "ML Intermediate"
default_args = {
'start_date': days_ago(0),
}
dag_airflow = DAG(
dag_ploomber.name.replace(' ', '-'),
default_args=default_args,
description='Ploomber dag',
schedule_interval=None,
)
script_cfg.save_script()
from airflow.operators.docker_operator import DockerOperator
for task_name in dag_ploomber:
DockerOperator(task_id=task_name,
image="continuumio/miniconda3",
api_version="auto",
auto_remove=True,
# command="sh /home/letyndr/airflow/dags/ml-intermediate/script.sh",
command="sleep 600",
docker_url="unix://var/run/docker.sock",
volumes=[
"/home/letyndr/airflow/dags/ml-intermediate:/home/letyndr/airflow/dags/ml-intermediate:rw",
"/home/letyndr/airflow-data/ml-intermediate:/home/letyndr/airflow-data/ml-intermediate:rw"
],
working_dir=script_cfg.paths.project,
dag=dag_airflow,
container_name=task_name,
)
for task_name in dag_ploomber:
task_ploomber = dag_ploomber[task_name]
task_airflow = dag_airflow.get_task(task_name)
for upstream in task_ploomber.upstream:
task_airflow.set_upstream(dag_airflow.get_task(upstream))
dag = dag_airflow
```
When I execute this DAG using Airflow, I get the error that the docker does not find the `/home/letyndr/airflow/dags/ml-intermediate/script.sh` script. I changed the execution command of the docker operator `sleep 600` to enter to the container and check the files in the container with the corrects paths.
**What you expected to happen**: Basically to share the files of the host with the docker container to execute a shell script within the container.
When I'm in the container I can go to this path /home/letyndr/airflow/dags/ml-intermediate/ for example, but I don't see the files that are supposed to be there.
**What do you think went wrong?**
I tried to replicate how Airflow implements Docker SDK for Python
This is my one replication of the docker implementation:
```
import docker
client = docker.APIClient()
# binds = {
# "/home/letyndr/airflow/dags": {
# "bind": "/home/letyndr/airflow/dags",
# "mode": "rw"
# },
# "/home/letyndr/airflow-data/ml-intermediate": {
# "bind": "/home/letyndr/airflow-data/ml-intermediate",
# "mode": "rw"
# }
# }
binds = ["/home/letyndr/airflow/dags:/home/letyndr/airflow/dags:rw",
"/home/letyndr/airflow-data/ml-intermediate:/home/letyndr/airflow-data/ml-intermediate:rw"]
container = client.create_container(
image="continuumio/miniconda3",
command="sleep 600",
volumes=["/home/letyndr/airflow/dags", "/home/letyndr/airflow-data/ml-intermediate"],
host_config=client.create_host_config(binds=binds),
working_dir="/home/letyndr/airflow/dags",
name="simple_example",
)
client.start(container=container.get("Id"))
```
What I found was that mounting volumes only works if it's set `host_config` and `volumes`, the problem is that the implementation on Airflow just set `host_config` but not `volumes`. I added the parameter on the method `create_container`, it worked.
**How to reproduce it**:
Mount a volume from a host and use the files inside the directory in the docker container.
| https://github.com/apache/airflow/issues/12537 | https://github.com/apache/airflow/pull/15843 | ac3454e4f169cdb0e756667575153aca8c1b6981 | 12995cfb9a90d1f93511a4a4ab692323e62cc318 | "2020-11-21T21:13:03Z" | python | "2021-05-17T15:03:18Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,509 | ["Dockerfile", "Dockerfile.ci", "IMAGES.rst", "scripts/ci/libraries/_build_images.sh"] | Add support for using Cloud Build in breeze build-image | **Description**
I would like to build airflow images using external services like Google Cloud Build. In this way I don't have to run the build on local machine and have to design custom CI pipelines to build it (especially for dev purposes).
**Use case / motivation**
Building production image can take time and running this on notebooks sounds like not an optimal way of doing it. We can take advantage of system dedicated to do this task like Google Cloud Build.
**Related Issues**
N/A
| https://github.com/apache/airflow/issues/12509 | https://github.com/apache/airflow/pull/12534 | 370e7d07d1ed1a53b73fe878425fdcd4c71a7ed1 | 37548f09acb91edd041565f52051f58610402cb3 | "2020-11-20T15:18:02Z" | python | "2020-11-21T18:21:43Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,485 | ["UPDATING.md", "airflow/models/dag.py", "airflow/www/views.py", "tests/www/views/test_views.py"] | Optionally clear downstream failed tasks when marking success | **Description**
For a DAG that looks like this:
```
A >> B
```
If A fails, B goes into "upstream_failed" state. If a user then marks A "success", B will remain in "upstream_failed" state. It will not automatically start running. This scenario often happens if the failure of A is dealt with outside of Airflow and the user does not want Airflow to run A again. But he usually expect Airflow to run B after A is marked "success".
**Use case / motivation**
After A is marked "success", its downstream tasks B should be cleared automatically and get ready to be scheduled. To avoid changing this behaviour completely, there can be a toggle button next to "Success" that lets the user decide if he wants Airflow to automatically clear downstream tasks that are in "upstream_failed" state. | https://github.com/apache/airflow/issues/12485 | https://github.com/apache/airflow/pull/13037 | 2bca8a5425c234b04fdf32d6c50ae3a91cd08262 | 6b2524fe733c42fe586405c84a496ac4aaf8fe49 | "2020-11-19T13:46:48Z" | python | "2021-05-29T17:11:19Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,468 | ["airflow/api_connexion/endpoints/provider_endpoint.py", "airflow/api_connexion/openapi/v1.yaml", "airflow/api_connexion/schemas/provider_schema.py", "airflow/security/permissions.py", "airflow/www/security.py", "docs/apache-airflow/security/access-control.rst", "tests/api_connexion/endpoints/test_provider_endpoint.py"] | Add API to query for providers | https://github.com/apache/airflow/issues/12468 | https://github.com/apache/airflow/pull/13394 | b8c0fde38a7df9d00185bf53e9f303b98fd064dc | 9dad095f735cd6a73bcbf57324d7ed79f622858c | "2020-11-19T01:19:32Z" | python | "2021-05-07T13:47:09Z" |
|
closed | apache/airflow | https://github.com/apache/airflow | 12,448 | ["airflow/models/serialized_dag.py", "tests/models/test_dagrun.py"] | General Error Deleting DAG run | Using a 2.0.0 beta 2 build...

Log from webserver:
```
[2020-11-18 14:58:52,712] {interface.py:713} ERROR - Delete record error: Dependency rule tried to blank-out primary key column 'job.id' on instance '<SchedulerJob at 0x7fc68ba7f990>'
Traceback (most recent call last):
File "/usr/local/lib/python3.7/site-packages/flask_appbuilder/models/sqla/interface.py", line 698, in delete
self.session.commit()
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/scoping.py", line 163, in do
return getattr(self.registry(), name)(*args, **kwargs)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 1042, in commit
self.transaction.commit()
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 504, in commit
self._prepare_impl()
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 483, in _prepare_impl
self.session.flush()
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 2536, in flush
self._flush(objects)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 2678, in _flush
transaction.rollback(_capture_exception=True)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/util/langhelpers.py", line 70, in __exit__
with_traceback=exc_tb,
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/util/compat.py", line 182, in raise_
raise exception
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/session.py", line 2638, in _flush
flush_context.execute()
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/unitofwork.py", line 422, in execute
rec.execute(self)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/unitofwork.py", line 538, in execute
self.dependency_processor.process_deletes(uow, states)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/dependency.py", line 547, in process_deletes
state, child, None, True, uowcommit, False
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/dependency.py", line 604, in _synchronize
sync.clear(dest, self.mapper, self.prop.synchronize_pairs)
File "/usr/local/lib/python3.7/site-packages/sqlalchemy/orm/sync.py", line 88, in clear
"column '%s' on instance '%s'" % (r, orm_util.state_str(dest))
AssertionError: Dependency rule tried to blank-out primary key column 'job.id' on instance '<SchedulerJob at 0x7fc68ba7f990>'
172.19.0.1 - - [18/Nov/2020:14:58:52 +0000] "POST /dagrun/delete/147 HTTP/1.1" 302 415 "http://localhost:8080/dagrun/list/?_flt_3_dag_id=airflow_community_data_refresh&_flt_3_state=success" "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/86.0.4240.193 Safari/537.36"
``` | https://github.com/apache/airflow/issues/12448 | https://github.com/apache/airflow/pull/12586 | c6467ba12d4a94027137e3173097d73be56c5d12 | 08251c145d9ace8fe2f1e1309833eb4d4ad54eca | "2020-11-18T14:56:14Z" | python | "2020-11-25T02:07:10Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,413 | ["airflow/migrations/versions/e165e7455d70_add_description_field_to_variable.py", "airflow/models/variable.py", "airflow/www/views.py"] | Description Field for Variables | add text column to explain what the variable is used for
same as https://github.com/apache/airflow/issues/10840 just for Variables.
| https://github.com/apache/airflow/issues/12413 | https://github.com/apache/airflow/pull/15194 | d944f5a59daf0c4512f87369c6eabb27666376bf | 925ef281856630f5231baf42a30a5eb18f0b7ca0 | "2020-11-17T18:32:16Z" | python | "2021-04-12T15:33:25Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,401 | ["airflow/www/views.py", "tests/www/views/test_views_connection.py"] | Duplicate connections UI | In the UI it would be nice to duplicate the selected connection, in the menu "With Selected".
The copy would be set with the same name plus some suffix (_copy, _1, whatever) and it is useful for when you have a connection with all equal fields except something in a particular one. | https://github.com/apache/airflow/issues/12401 | https://github.com/apache/airflow/pull/15574 | 621ef766ffc77c7bd51c81fe802fa019a44094ea | 2011da25a50edfcdf7657ec172f57ae6e43ca216 | "2020-11-17T13:39:57Z" | python | "2021-06-17T15:22:44Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,369 | ["airflow/cli/commands/connection_command.py", "tests/cli/commands/test_connection_command.py"] | In 2.0.0b2/master, CLI "airflow connections add" is not handling invalid URIs properly |
<!--
Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.
Don't worry if they're not all applicable; just try to include what you can :-)
If you need to include code snippets or logs, please put them in fenced code
blocks. If they're super-long, please use the details tag like
<details><summary>super-long log</summary> lots of stuff </details>
Please delete these comment blocks before submitting the issue.
-->
<!--
IMPORTANT!!!
PLEASE CHECK "SIMILAR TO X EXISTING ISSUES" OPTION IF VISIBLE
NEXT TO "SUBMIT NEW ISSUE" BUTTON!!!
PLEASE CHECK IF THIS ISSUE HAS BEEN REPORTED PREVIOUSLY USING SEARCH!!!
Please complete the next sections or the issue will be closed.
These questions are the first thing we need to know to understand the context.
-->
**Apache Airflow version**: 2.0.0b2
**Environment**: Docker
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release):
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
**What happened**:
In current 2.0.0b2, `airflow connections add` is not handling invalid URIs properly.
For example, running CLI `airflow connections add --conn-uri xyz conn_dummy` will
- create the connection “conn_dummy” successfully (which should not be, IMO)
- in the connection created, it only has one attribute filled, which is “schema”, but the value added is “yz” (the value we provide is "xyz", i.e., the 1st element is removed. because of https://github.com/apache/airflow/blob/master/airflow/models/connection.py#L213)
<img width="552" alt="airflow_0284e070228e___" src="https://user-images.githubusercontent.com/11539188/99155673-519dca00-26ba-11eb-9517-31c1a238476f.png">
<!-- (please include exact error messages if you can) -->
**What you expected to happen**:
In my opinion, a validation should be done first to ensure the `conn-uri` provided is valid (at least have “scheme” and “netloc” available in the `urlparse` result), and reject if it's an invalid URI.
<!-- What do you think went wrong? -->
<!---
As minimally and precisely as possible. Keep in mind we do not have access to your cluster or dags.
If you are using kubernetes, please attempt to recreate the issue using minikube or kind.
## Install minikube/kind
- Minikube https://minikube.sigs.k8s.io/docs/start/
- Kind https://kind.sigs.k8s.io/docs/user/quick-start/
If this is a UI bug, please provide a screenshot of the bug or a link to a youtube video of the bug in action
You can include images using the .md style of

To record a screencast, mac users can use QuickTime and then create an unlisted youtube video with the resulting .mov file.
--->
<!--
How often does this problem occur? Once? Every time etc?
Any relevant logs to include? Put them here in side a detail tag:
<details><summary>x.log</summary> lots of stuff </details>
-->
| https://github.com/apache/airflow/issues/12369 | https://github.com/apache/airflow/pull/12370 | 9ba8b31ed5001ea6522657b86ce3dfd2a75d594c | 823b3aace298ab13d2e19b8f0bf1c426ff20407c | "2020-11-14T19:47:00Z" | python | "2020-11-15T10:47:57Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,344 | ["airflow/www/static/js/base.js"] | 2.0.0b2: time-label is wrongly formatted | When hovering over the timestamp.
<img width="224" alt="image" src="https://user-images.githubusercontent.com/8430564/99089658-e1197f00-25cd-11eb-86b5-6633a410ce3b.png">
| https://github.com/apache/airflow/issues/12344 | https://github.com/apache/airflow/pull/12447 | 7ca0b6f121c9cec6e25de130f86a56d7c7fbe38c | b584adbe1120d5e2b8a9fae3356a97f13ed70cd3 | "2020-11-13T15:33:30Z" | python | "2020-11-18T14:58:06Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,329 | ["airflow/providers/google/cloud/transfers/gcs_to_bigquery.py", "tests/providers/google/cloud/transfers/test_gcs_to_bigquery.py"] | GCSToBigQueryOperator - allow upload to existing table without specifying schema_fields/schema_object | **Description**
We would like to be able to load data to **existing** BigQuery tables without having to specify schema_fields/schema_object in `GCSToBigQueryOperator` since table already exists.
**Use case / motivation**
<details>
<summary>BigQuery load job usage details and problem explanation</summary>
We create BigQuery tables/datasets through CI process (terraform managed), with the help of Airflow we updating those tables with data.
To update tables with data we can use:
Airflow 2.0 operator: GCSToBigQueryOperator
Airflow 1.* operator (deprecated) GoogleCloudStorageToBigQueryOperator
However those operator require to specify one of 3 things:
1. schema_fields - fields that define table schema
2. schema_object - a GCS object path pointing to a .json file that contains the schema for the table
3. or autodetect=True
In other cases it will:
```
raise ValueError('At least one of `schema_fields`, `schema_object`, '
'or `autodetect**=True**` must be passed.')
```
_Note: it does not actually says that `autodetect` must be `True` in exception - but according to code it must be specified as True, or schema should be used otherwise._
But we already have created table, and we can update it using
`bq load` command. (which Airflow operators mentioned above are using internally)
When using `bq load` - you also have an option to specify **schema**. The schema can be a local JSON file, or it can be typed inline as part of the command. You can also use the `--autodetect` flag instead of supplying a schema definition.
https://cloud.google.com/bigquery/docs/loading-data-cloud-storage-csv#bq
When you specify `--autodetect` as True - BigQuery will try to give random names to your columns, e.g.: 'string_field_0', 'int_field_1' - and if you are trying to load into **existing** table - `bq load` will fail with error:
'Cannot add fields (field: string_field_0)'}.'
Same way Airflow operators like 'GCSToBigQueryOperator' will fail.
However **there is also an option NOT to specify** `--autodetect` or specify `--autodetect=False` and in this case `bq load` will load from CloudStorage to **existing** BQ table without problems.
</details>
Proposal/TL;DR:
Add an option **not** to specify `--autodetect` or specify `--autodetect=False` when write_disposition='WRITE_APPEND' is used in GCSToBigQueryOperator. This will allow an operator to update **existing** BigQuery table without having to specify schema within the operator itself (it will just be updating **existing** table with data).
| https://github.com/apache/airflow/issues/12329 | https://github.com/apache/airflow/pull/28564 | 3df204b53247f51d94135698defdbae0d359921f | d7f5f6d737cf06cc8e216f523534aeaf48065793 | "2020-11-12T22:38:28Z" | python | "2022-12-24T14:26:59Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,313 | ["airflow/www/templates/airflow/dags.html"] | The "Filter tags" multi-select container can't hold the selected tags while switching between views | <!--
Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.
Don't worry if they're not all applicable; just try to include what you can :-)
If you need to include code snippets or logs, please put them in fenced code
blocks. If they're super-long, please use the details tag like
<details><summary>super-long log</summary> lots of stuff </details>
Please delete these comment blocks before submitting the issue.
-->
<!--
IMPORTANT!!!
PLEASE CHECK "SIMILAR TO X EXISTING ISSUES" OPTION IF VISIBLE
NEXT TO "SUBMIT NEW ISSUE" BUTTON!!!
PLEASE CHECK IF THIS ISSUE HAS BEEN REPORTED PREVIOUSLY USING SEARCH!!!
Please complete the next sections or the issue will be closed.
These questions are the first thing we need to know to understand the context.
-->
**Apache Airflow version**: 1.10.12 & 2.0.0b2
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
**Environment**:
- **Cloud provider or hardware configuration**: N/A
- **OS** (e.g. from /etc/os-release): Linux mint 20
- **Kernel** (e.g. `uname -a`): 5.4.0-52-generic
- **Install tools**:
- **Others**:
**What happened**:

<!-- (please include exact error messages if you can) -->
**What you expected to happen**: The selected tags should stay the same while switching between `All`, `Active`, and `Paused`
<!-- What do you think went wrong? -->
**How to reproduce it**:
<!---
As minimally and precisely as possible. Keep in mind we do not have access to your cluster or dags.
If you are using kubernetes, please attempt to recreate the issue using minikube or kind.
## Install minikube/kind
- Minikube https://minikube.sigs.k8s.io/docs/start/
- Kind https://kind.sigs.k8s.io/docs/user/quick-start/
If this is a UI bug, please provide a screenshot of the bug or a link to a youtube video of the bug in action
You can include images using the .md style of

To record a screencast, mac users can use QuickTime and then create an unlisted youtube video with the resulting .mov file.
--->
**Anything else we need to know**:
<!--
How often does this problem occur? Once? Every time etc?
Any relevant logs to include? Put them here in side a detail tag:
<details><summary>x.log</summary> lots of stuff </details>
-->
| https://github.com/apache/airflow/issues/12313 | https://github.com/apache/airflow/pull/12324 | af19b126e94876c371553f6a7cfae6b1102f79fd | 7f828b03ccef848c740f8013c56a856708ed505c | "2020-11-12T16:56:23Z" | python | "2020-11-12T21:45:25Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,309 | ["airflow/operators/python.py", "docs/concepts.rst", "tests/operators/test_python.py"] | TaskGroup does not support dynamically generated tasks | **Apache Airflow version**:
2.0 / master
**Environment**:
breeze
**What happened**:
Using this DAG:
```py
from airflow.operators.bash import BashOperator
from airflow.operators.python import task
from airflow.models import DAG
from airflow.utils.task_group import TaskGroup
@task
def show():
print("Cats are awesome!")
with DAG(
"using_task_group",
default_args={'owner': 'airflow'},
start_date=days_ago(2),
schedule_interval=None,
) as dag3:
start_task = BashOperator(
task_id="start_task",
bash_command="echo start",
)
end_task = BashOperator(
task_id="end_task",
bash_command="echo end",
)
with TaskGroup(group_id="show_tasks") as tg1:
previous_show = show()
for _ in range(100):
next_show = show()
previous_show >> next_show
previous_show = next_show
```
I get:
```
Broken DAG: [/files/dags/test.py] Traceback (most recent call last):
File "/opt/airflow/airflow/models/baseoperator.py", line 410, in __init__
task_group.add(self)
File "/opt/airflow/airflow/utils/task_group.py", line 140, in add
raise DuplicateTaskIdFound(f"Task id '{key}' has already been added to the DAG")
airflow.exceptions.DuplicateTaskIdFound: Task id 'show_tasks.show' has already been added to the DAG
```
If I remove the task group the task are generated as expected.
**What you expected to happen**:
I expect to be able to generate tasks dynamically using TaskGroup and task decoratos.
**How to reproduce it**:
Use the DAG from above.
**Anything else we need to know**:
N/A
| https://github.com/apache/airflow/issues/12309 | https://github.com/apache/airflow/pull/12312 | 823b3aace298ab13d2e19b8f0bf1c426ff20407c | 39ea8722c04fb1c0b286b4248a52e8d974a47b30 | "2020-11-12T13:43:16Z" | python | "2020-11-15T11:28:04Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,306 | ["airflow/api_connexion/openapi/v1.yaml", "tests/api_connexion/endpoints/test_task_instance_endpoint.py"] | Getting list of taskInstances without start_date, end_date and state fails in REST API | Getting a list of taskinstances where some taskinstances does not have start date, end date and state fails
**Apache Airflow version**: 2.0
**What happened**:
Calling the endpoint `http://localhost:28080/api/v1//dags/~/dagRuns/~/taskInstances/list` with dags whose tasks instances have not started fails.
```
{
"detail": "None is not of type 'string'\n\nFailed validating 'type' in schema['allOf'][0]['properties']['task_instances']['items']['properties']['start_date']:\n {'format': 'datetime', 'type': 'string'}\n\nOn instance['task_instances'][9349]['start_date']:\n None",
"status": 500,
"title": "Response body does not conform to specification",
"type": "https://airflow.readthedocs.io/en/latest/stable-rest-api-ref.html#section/Errors/Unknown"
}
```
**What you expected to happen**:
I expected to see a list of task instances like this:
```
{
"task_instances": [
{
"dag_id": "latest_only",
"duration": 0.481884,
"end_date": "2020-11-11T22:03:03.822310+00:00",
"execution_date": "2020-11-10T12:00:00+00:00",
"executor_config": "{}",
"hostname": "7b6c973dde4b",
"max_tries": 0,
"operator": "LatestOnlyOperator",
"pid": 1943,
"pool": "default_pool",
"pool_slots": 1,
"priority_weight": 2,
"queue": "default",
"queued_when": "2020-11-11T22:03:02.648502+00:00",
"sla_miss": null,
"start_date": "2020-11-11T22:03:03.340426+00:00",
"state": "success",
"task_id": "latest_only",
"try_number": 1,
"unixname": "root"
},
{
"dag_id": "example_branch_dop_operator_v3",
"duration": null,
"end_date": null,
"execution_date": "2020-11-11T02:18:00+00:00",
"executor_config": "{}",
"hostname": "",
"max_tries": 0,
"operator": "BranchPythonOperator",
"pid": null,
"pool": "default_pool",
"pool_slots": 1,
"priority_weight": 3,
"queue": "default",
"queued_when": null,
"sla_miss": null,
"start_date": null,
"state": null,
"task_id": "condition",
"try_number": 0,
"unixname": "root"
}
],
"total_entries": 2
}
```
**How to reproduce it**:
Call the endpoint `http://localhost:28080/api/v1//dags/~/dagRuns/~/taskInstances/list` with dags whose tasks instances have not started. | https://github.com/apache/airflow/issues/12306 | https://github.com/apache/airflow/pull/12453 | e9cfa393ab05e9d9546e5c203d4b39af5586031d | 44282350716e322a20e9da069f63d4f2fa6fbc42 | "2020-11-12T08:08:38Z" | python | "2020-11-20T11:36:46Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,298 | ["docs/apache-airflow/concepts/dags.rst"] | Add docs around when to use TaskGroup vs SubDag and potentially listing PROs and CONS. | It would be great for users to know when they should use TaskGroup vs SubDag. A section somewhere in docs would be great or even better a Matrix / table to explain difference would be aweomse.
What are the PROs and CONs of each
**TaskGroup**: https://airflow.readthedocs.io/en/latest/concepts.html#taskgroup
**SubDags**: https://airflow.readthedocs.io/en/latest/concepts.html#subdags | https://github.com/apache/airflow/issues/12298 | https://github.com/apache/airflow/pull/20700 | 129b4d2ac2ce09d42fb487f8a9aaac7eb7901a05 | 6b0c52898555641059e149c5ff0d9b46b2d45379 | "2020-11-11T23:56:49Z" | python | "2022-01-09T21:58:26Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,292 | ["airflow/operators/subdag.py", "tests/operators/test_subdag_operator.py"] | Deprecate SubDags in Favor of TaskGroups | Once TaskGroups (https://airflow.readthedocs.io/en/latest/concepts.html#taskgroup) that would be released in Airflow 2.0 reach feature parity with SubDags and we have wide adoption and feedback from users about Taskgroups we should deprecate Subdags and remove them eventually in Airflow 3.0.
Discussion Thread: https://lists.apache.org/thread.html/ra52746f9c8274469d343b5f0251199de776e75ab75ded6830886fb6a%40%3Cdev.airflow.apache.org%3E | https://github.com/apache/airflow/issues/12292 | https://github.com/apache/airflow/pull/17488 | 69d2ed65cb7c9384d309ae5e499d5798c2c3ac96 | b311bc0237b28c6d23f54137ed46f46e7fa5893f | "2020-11-11T23:15:54Z" | python | "2021-08-08T12:05:31Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,276 | ["airflow/api_connexion/openapi/v1.yaml", "airflow/models/log.py", "tests/api_connexion/endpoints/test_event_log_endpoint.py"] | (REST API) List event logs endpoint is not working |
The list event log endpoint is not working
**Apache Airflow version**: 2.0
**Environment**: Breeze
**What happened**:
Calling the list event logs endpoint `http://localhost:28080/api/v1/eventLogs` returns 500 status code with the below message:
```
{
"detail": "None is not of type 'string'\n\nFailed validating 'type' in schema['allOf'][0]['properties']['event_logs']['items']['properties']['dag_id']:\n {'description': 'The DAG ID', 'readOnly': True, 'type': 'string'}\n\nOn instance['event_logs'][0]['dag_id']:\n None",
"status": 500,
"title": "Response body does not conform to specification",
"type": "https://airflow.readthedocs.io/en/latest/stable-rest-api-ref.html#section/Errors/Unknown"
}
```
**What you expected to happen**:
I expected it to produce a list of event logs:
```
{
"event_logs": [
{
"dag_id": null,
"event": "cli_webserver",
"event_log_id": 482,
"execution_date": null,
"extra": "{\"host_name\": \"e24b454f002a\", \"full_command\": \"['/usr/local/bin/airflow', 'webserver']\"}",
"owner": "root",
"task_id": null,
"when": "2020-11-11T03:28:48.722814+00:00"
},
{
"dag_id": null,
"event": "cli_scheduler",
"event_log_id": 483,
"execution_date": null,
"extra": "{\"host_name\": \"e24b454f002a\", \"full_command\": \"['/usr/local/bin/airflow', 'scheduler']\"}",
"owner": "root",
"task_id": null,
"when": "2020-11-11T03:32:18.684231+00:00"
},
],
"total_entries": 2
}
```
**How to reproduce it**:
1. Start airflow webserver and scheduler in breeze
2. Call the endpoint `http://localhost:28080/api/v1/eventLogs`
3. Check the response
| https://github.com/apache/airflow/issues/12276 | https://github.com/apache/airflow/pull/12287 | 388736bf97a4313f81aadbeecbb99e5fcb145c31 | 0d37c59669afebe774355a310a889e3cfa378862 | "2020-11-11T04:21:14Z" | python | "2020-11-11T19:10:13Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,263 | ["scripts/ci/images/ci_build_dockerhub.sh"] | Tagged Production image build fails in DockerHub | Beta1 and Beta 2 were built manually as the scripts for building images in DockerHub ALMOST worked :). they failed on pulling tagged -build images.
```
Cloning into '.'...
Warning: Permanently added the RSA host key for IP address '140.82.114.4' to the list of known hosts.
Switched to a new branch '2.0.0b2'
Executing build hook...
DOCKER_REPO=index.docker.io/apache/airflow
DOCKERHUB_USER=apache
DOCKERHUB_REPO=airflow
DOCKER_TAG=2.0.0b2-python3.7
Detected PYTHON_MAJOR_MINOR_VERSION=3.7
+++ date +%s
++ START_SCRIPT_TIME=1605013728
++ build_images::determine_docker_cache_strategy
++ [[ -z '' ]]
++ [[ false == \t\r\u\e ]]
++ export DOCKER_CACHE=pulled
++ DOCKER_CACHE=pulled
++ readonly DOCKER_CACHE
++ verbosity::print_info
++ [[ false == \t\r\u\e ]]
++ verbosity::print_info 'Using pulled cache strategy for the build.'
++ [[ false == \t\r\u\e ]]
++ verbosity::print_info
++ [[ false == \t\r\u\e ]]
++ initialization::get_environment_for_builds_on_ci
++ [[ false == \t\r\u\e ]]
++ export CI_TARGET_REPO=apache/airflow
++ CI_TARGET_REPO=apache/airflow
++ export CI_TARGET_BRANCH=master
++ CI_TARGET_BRANCH=master
++ export CI_BUILD_ID=0
++ CI_BUILD_ID=0
++ export CI_JOB_ID=0
++ CI_JOB_ID=0
++ export CI_EVENT_TYPE=pull_request
++ CI_EVENT_TYPE=pull_request
++ export CI_REF=refs/head/master
++ CI_REF=refs/head/master
++ [[ false == \t\r\u\e ]]
++ build_images::get_docker_image_names
++ export PYTHON_BASE_IMAGE_VERSION=3.7
++ PYTHON_BASE_IMAGE_VERSION=3.7
++ export PYTHON_BASE_IMAGE=python:3.7-slim-buster
++ PYTHON_BASE_IMAGE=python:3.7-slim-buster
++ export AIRFLOW_CI_BASE_TAG=master-python3.7-ci
++ AIRFLOW_CI_BASE_TAG=master-python3.7-ci
++ export AIRFLOW_CI_IMAGE=apache/airflow:master-python3.7-ci
++ AIRFLOW_CI_IMAGE=apache/airflow:master-python3.7-ci
++ export AIRFLOW_CI_PYTHON_IMAGE=apache/airflow:python3.7-master
++ AIRFLOW_CI_PYTHON_IMAGE=apache/airflow:python3.7-master
++ export AIRFLOW_CI_IMAGE=apache/airflow:master-python3.7-ci
++ AIRFLOW_CI_IMAGE=apache/airflow:master-python3.7-ci
++ [[ 2.0.0b2-python3.7 == '' ]]
++ export AIRFLOW_PROD_BASE_TAG=2.0.0b2-python3.7
++ AIRFLOW_PROD_BASE_TAG=2.0.0b2-python3.7
++ export AIRFLOW_PROD_IMAGE=apache/airflow:2.0.0b2-python3.7
++ AIRFLOW_PROD_IMAGE=apache/airflow:2.0.0b2-python3.7
++ export AIRFLOW_PROD_BUILD_IMAGE=apache/airflow:2.0.0b2-python3.7-build
++ AIRFLOW_PROD_BUILD_IMAGE=apache/airflow:2.0.0b2-python3.7-build
++ export AIRFLOW_PROD_IMAGE_KUBERNETES=apache/airflow:2.0.0b2-python3.7-kubernetes
++ AIRFLOW_PROD_IMAGE_KUBERNETES=apache/airflow:2.0.0b2-python3.7-kubernetes
++ export AIRFLOW_PROD_IMAGE_DEFAULT=apache/airflow:master
++ AIRFLOW_PROD_IMAGE_DEFAULT=apache/airflow:master
++ export BUILT_CI_IMAGE_FLAG_FILE=/src/b3fpsnmwartmqn9f6rbzfxh/.build/master/.built_3.7
++ BUILT_CI_IMAGE_FLAG_FILE=/src/b3fpsnmwartmqn9f6rbzfxh/.build/master/.built_3.7
++ initialization::make_constants_read_only
++ readonly PYTHON_MAJOR_MINOR_VERSION
++ readonly WEBSERVER_HOST_PORT
++ readonly POSTGRES_HOST_PORT
++ readonly MYSQL_HOST_PORT
++ readonly HOST_USER_ID
++ readonly HOST_GROUP_ID
++ readonly HOST_AIRFLOW_SOURCES
++ readonly HOST_HOME
++ readonly HOST_OS
++ readonly ENABLE_KIND_CLUSTER
++ readonly KUBERNETES_MODE
++ readonly KUBERNETES_VERSION
++ readonly KIND_VERSION
++ readonly HELM_VERSION
++ readonly KUBECTL_VERSION
++ readonly BACKEND
++ readonly POSTGRES_VERSION
++ readonly MYSQL_VERSION
++ readonly MOUNT_LOCAL_SOURCES
++ readonly INSTALL_AIRFLOW_VERSION
++ readonly INSTALL_AIRFLOW_REFERENCE
++ readonly DB_RESET
++ readonly VERBOSE
++ readonly START_AIRFLOW
++ readonly PRODUCTION_IMAGE
++ readonly SKIP_BUILDING_PROD_IMAGE
++ readonly CI_BUILD_ID
++ readonly CI_JOB_ID
++ readonly IMAGE_TAG
++ readonly AIRFLOW_PRE_CACHED_PIP_PACKAGES
++ readonly INSTALL_AIRFLOW_VIA_PIP
++ readonly AIRFLOW_LOCAL_PIP_WHEELS
++ readonly AIRFLOW_CONSTRAINTS_REFERENCE
++ readonly AIRFLOW_CONSTRAINTS_LOCATION
++ readonly ADDITIONAL_AIRFLOW_EXTRAS
++ readonly ADDITIONAL_PYTHON_DEPS
++ readonly AIRFLOW_PRE_CACHED_PIP_PACKAGES
++ readonly DEV_APT_COMMAND
++ readonly DEV_APT_DEPS
++ readonly ADDITIONAL_DEV_APT_COMMAND
++ readonly ADDITIONAL_DEV_APT_DEPS
++ readonly ADDITIONAL_DEV_APT_ENV
++ readonly RUNTIME_APT_COMMAND
++ readonly RUNTIME_APT_DEPS
++ readonly ADDITIONAL_RUNTIME_APT_COMMAND
++ readonly ADDITIONAL_RUNTIME_APT_DEPS
++ readonly ADDITIONAL_RUNTIME_APT_ENV
++ readonly DOCKERHUB_USER
++ readonly DOCKERHUB_REPO
++ readonly DOCKER_CACHE
++ readonly USE_GITHUB_REGISTRY
++ readonly GITHUB_REGISTRY
++ readonly GITHUB_REGISTRY_WAIT_FOR_IMAGE
++ readonly GITHUB_REGISTRY_PULL_IMAGE_TAG
++ readonly GITHUB_REGISTRY_PUSH_IMAGE_TAG
++ readonly GITHUB_REPOSITORY
++ readonly GITHUB_TOKEN
++ readonly GITHUB_USERNAME
++ readonly FORWARD_CREDENTIALS
++ readonly USE_GITHUB_REGISTRY
++ readonly EXTRA_STATIC_CHECK_OPTIONS
++ readonly VERSION_SUFFIX_FOR_PYPI
++ readonly VERSION_SUFFIX_FOR_SVN
++ readonly PYTHON_BASE_IMAGE_VERSION
++ readonly PYTHON_BASE_IMAGE
++ readonly AIRFLOW_CI_BASE_TAG
++ readonly AIRFLOW_CI_IMAGE
++ readonly AIRFLOW_CI_IMAGE_DEFAULT
++ readonly AIRFLOW_PROD_BASE_TAG
++ readonly AIRFLOW_PROD_IMAGE
++ readonly AIRFLOW_PROD_BUILD_IMAGE
++ readonly AIRFLOW_PROD_IMAGE_KUBERNETES
++ readonly AIRFLOW_PROD_IMAGE_DEFAULT
++ readonly BUILT_CI_IMAGE_FLAG_FILE
++ readonly INIT_SCRIPT_FILE
++ traps::add_trap start_end::script_end EXIT HUP INT TERM
++ trap=start_end::script_end
++ shift
++ for signal in '"${@}"'
++ local handlers
+++ cut -f2 -d \'
+++ trap -p EXIT
++ handlers='rm -rf -- '
++ trap 'start_end::script_end;rm -rf -- ' EXIT
++ for signal in '"${@}"'
++ local handlers
+++ cut -f2 -d \'
+++ trap -p HUP
++ handlers='rm -rf -- '
++ trap 'start_end::script_end;rm -rf -- ' HUP
++ for signal in '"${@}"'
++ local handlers
+++ cut -f2 -d \'
+++ trap -p INT
++ handlers='rm -rf -- '
++ trap 'start_end::script_end;rm -rf -- ' INT
++ for signal in '"${@}"'
++ local handlers
+++ cut -f2 -d \'
+++ trap -p TERM
++ handlers='rm -rf -- '
++ trap 'start_end::script_end;rm -rf -- ' TERM
+ [[ 2.0.0b2-python3.7 == *python*-ci ]]
+ [[ 2.0.0b2-python3.7 == *python* ]]
+ echo
+ echo 'Building prod image'
Building prod image
+ echo
+ rm -rf /src/b3fpsnmwartmqn9f6rbzfxh/.build
+ build_images::prepare_prod_build
+ [[ -n '' ]]
+ [[ -n '' ]]
+ EXTRA_DOCKER_PROD_BUILD_FLAGS=("--build-arg" "AIRFLOW_CONSTRAINTS_REFERENCE=${DEFAULT_CONSTRAINTS_BRANCH}")
+ [[ 3.6 == \3\.\7 ]]
+ export DEFAULT_CI_IMAGE=
+ DEFAULT_CI_IMAGE=
+ export THE_IMAGE_TYPE=PROD
+ THE_IMAGE_TYPE=PROD
+ export 'IMAGE_DESCRIPTION=Airflow production'
+ IMAGE_DESCRIPTION='Airflow production'
+ export AIRFLOW_EXTRAS=async,aws,azure,celery,dask,elasticsearch,gcp,kubernetes,mysql,postgres,redis,slack,ssh,statsd,virtualenv
+ AIRFLOW_EXTRAS=async,aws,azure,celery,dask,elasticsearch,gcp,kubernetes,mysql,postgres,redis,slack,ssh,statsd,virtualenv
+ readonly AIRFLOW_EXTRAS
+ export AIRFLOW_IMAGE=apache/airflow:2.0.0b2-python3.7
+ AIRFLOW_IMAGE=apache/airflow:2.0.0b2-python3.7
+ readonly AIRFLOW_IMAGE
+ [[ false == \t\r\u\e ]]
+ AIRFLOW_BRANCH_FOR_PYPI_PRELOADING=master
+ sanity_checks::go_to_airflow_sources
+ verbosity::print_info
+ [[ false == \t\r\u\e ]]
+ pushd /src/b3fpsnmwartmqn9f6rbzfxh
+ verbosity::print_info
+ [[ false == \t\r\u\e ]]
++ pwd
+ verbosity::print_info 'Running in host in /src/b3fpsnmwartmqn9f6rbzfxh'
+ [[ false == \t\r\u\e ]]
+ verbosity::print_info
+ [[ false == \t\r\u\e ]]
+ build_images::build_prod_images
+ build_images::print_build_info
+ verbosity::print_info
+ [[ false == \t\r\u\e ]]
+ verbosity::print_info 'Airflow 2.0.0b2 Python: 3.7. Image description: Airflow production'
+ [[ false == \t\r\u\e ]]
+ verbosity::print_info
+ [[ false == \t\r\u\e ]]
+ [[ false == \t\r\u\e ]]
+ push_pull_remove_images::pull_prod_images_if_needed
+ [[ pulled == \p\u\l\l\e\d ]]
+ [[ false == \t\r\u\e ]]
+ [[ false == \t\r\u\e ]]
+ push_pull_remove_images::pull_image_if_not_present_or_forced apache/airflow:2.0.0b2-python3.7-build
+ local IMAGE_TO_PULL=apache/airflow:2.0.0b2-python3.7-build
+ local IMAGE_HASH
++ docker images -q apache/airflow:2.0.0b2-python3.7-build
+ IMAGE_HASH=
+ local PULL_IMAGE=false
+ [[ -z '' ]]
+ PULL_IMAGE=true
+ [[ true == \t\r\u\e ]]
+ echo
+ echo 'Pulling the image apache/airflow:2.0.0b2-python3.7-build'
Pulling the image apache/airflow:2.0.0b2-python3.7-build
+ echo
+ docker pull apache/airflow:2.0.0b2-python3.7-build
+ verbosity::store_exit_on_error_status
+ exit_on_error=false
+ [[ ehuxB == *e* ]]
+ exit_on_error=true
+ set +e
+ [[ false == \t\r\u\e ]]
+ [[ true == \f\a\l\s\e ]]
+ /usr/bin/docker pull apache/airflow:2.0.0b2-python3.7-build
++ tee -a /tmp/tmp.zuzptQhEPi/out.log
++ tee -a /tmp/tmp.zuzptQhEPi/out.log
Error response from daemon: manifest for apache/airflow:2.0.0b2-python3.7-build not found: manifest unknown: manifest unknown
+ res=1
+ [[ 1 == \0 ]]
+ [[ true == \f\a\l\s\e ]]
+ verbosity::restore_exit_on_error_status
+ [[ true == \t\r\u\e ]]
+ set -e
+ unset exit_on_error
+ return 1
+ start_end::script_end
+ local exit_code=1
+ [[ 1 != 0 ]]
+ [[ -f /tmp/tmp.zuzptQhEPi/out.log ]]
+ [[ true == \f\a\l\s\e ]]
+ [[ false == \t\r\u\e ]]
+ verbosity::print_info '###########################################################################################'
+ [[ false == \t\r\u\e ]]
+ verbosity::print_info ' EXITING WITH STATUS CODE 1'
+ [[ false == \t\r\u\e ]]
+ verbosity::print_info '###########################################################################################'
+ [[ false == \t\r\u\e ]]
+ [[ true == \t\r\u\e ]]
+ set +x
build hook failed! (1)
``` | https://github.com/apache/airflow/issues/12263 | https://github.com/apache/airflow/pull/12378 | 561e4594913395c52a331e44ec2f638b55fa513e | 0038660fddc99f454a8ecf4de53be9848f7ddc5d | "2020-11-10T17:55:26Z" | python | "2020-11-16T01:26:36Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,262 | ["airflow/www/compile_assets_if_needed.sh", "scripts/in_container/entrypoint_ci.sh"] | Automate asset compiling when entering breeze | Currenlty when any of the assets change, you need to remember about recompiling assets when entering breeze. This can be easily automated so that the rebuild will happen every time assets change.
Linked issue #12258 | https://github.com/apache/airflow/issues/12262 | https://github.com/apache/airflow/pull/13292 | af611e76ed18b51b32bc72dfe4d97af6b21e7d5f | a1e06ac7a65dddfee26e39b4191766d9c840c1fe | "2020-11-10T17:43:31Z" | python | "2020-12-23T20:08:47Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,261 | ["scripts/ci/images/ci_build_dockerhub.sh", "scripts/ci/images/ci_prepare_prod_image_on_ci.sh", "scripts/ci/libraries/_build_images.sh", "scripts/ci/libraries/_initialization.sh"] | Make production image use provider packages to build for production. | Currently, the production image we use is build directly from sources. This is great for development, but id does not really test if airflow will work if installed from packages. We should be able to build the packages locally and build the image using whl packages as sources of pip packages.
This will be as close to someone installing airflow from those packages manually
Once we use it for testing, we should also consider to build the image published in DockerHub to be built from those packages but it adds some complications in building scripts. This is possible but we have to test it first.
That needs two parts:
- [x] changing images in CI to be built from packages
- [x] changing images in DockerHub to be built from packages
| https://github.com/apache/airflow/issues/12261 | https://github.com/apache/airflow/pull/12908 | ef523b4c2bdb10a10ad042d36a57157cb5d85723 | f9e9ad2b096ff9d8ee78224333f799ca3968b6bd | "2020-11-10T17:32:52Z" | python | "2020-12-08T11:45:03Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,255 | [".pre-commit-config.yaml", "BREEZE.rst", "CONTRIBUTING.rst", "INSTALL", "STATIC_CODE_CHECKS.rst", "breeze-complete", "docs/installation.rst", "scripts/ci/pre_commit/pre_commit_check_extras_have_providers.py", "setup.py"] | Statsd (tbc. if more extras) tries to install provider package where it is missing | **Apache Airflow version**: 2.0.0b2
**What happened**:
Starting in 2.0.0b2, extras that aren't also providers like `statsd` prevents Airflow from being installed.
**How to reproduce it**:
```
$ docker run --rm -ti python:3.6 bash
# pip install apache-airflow[statsd]==2.0.0b2
Collecting apache-airflow[statsd]==2.0.0b2
Downloading apache_airflow-2.0.0b2-py3-none-any.whl (4.5 MB)
...
Collecting statsd<4.0,>=3.3.0; extra == "statsd"
Downloading statsd-3.3.0-py2.py3-none-any.whl (11 kB)
ERROR: Could not find a version that satisfies the requirement apache-airflow-providers-statsd; extra == "statsd" (from apache-airflow[statsd]==2.0.0b2) (from versions: none)
ERROR: No matching distribution found for apache-airflow-providers-statsd; extra == "statsd" (from apache-airflow[statsd]==2.0.0b2)
```
I believe this is from https://github.com/apache/airflow/pull/12233 | https://github.com/apache/airflow/issues/12255 | https://github.com/apache/airflow/pull/12265 | cbf49848afa43f693d890ac5cce8000aa723d2bf | 348510f86b8ee6b7d89c1355258e61095a6a29e9 | "2020-11-10T15:50:43Z" | python | "2020-11-11T16:13:57Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,254 | ["airflow/www/templates/airflow/dag.html"] | "Log" button on graph view popup doesn't open the logs view | <!--
Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.
Don't worry if they're not all applicable; just try to include what you can :-)
If you need to include code snippets or logs, please put them in fenced code
blocks. If they're super-long, please use the details tag like
<details><summary>super-long log</summary> lots of stuff </details>
Please delete these comment blocks before submitting the issue.
-->
<!--
IMPORTANT!!!
PLEASE CHECK "SIMILAR TO X EXISTING ISSUES" OPTION IF VISIBLE
NEXT TO "SUBMIT NEW ISSUE" BUTTON!!!
PLEASE CHECK IF THIS ISSUE HAS BEEN REPORTED PREVIOUSLY USING SEARCH!!!
Please complete the next sections or the issue will be closed.
These questions are the first thing we need to know to understand the context.
-->
**Apache Airflow version**:2.0.0b2
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`): NA
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release): macOS
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
**What happened**:
Clicking on the "Log" button in the popup for task instance in graph view doesn't link to the logs view.
<!-- (please include exact error messages if you can) -->
**What you expected to happen**:
The log view should show up.
<!-- What do you think went wrong? -->
**How to reproduce it**:
1. Run any dag.
2. Open Graph VIew.
3. Click on any task.
4. Click on the "Log" button.
<!---
As minimally and precisely as possible. Keep in mind we do not have access to your cluster or dags.
If you are using kubernetes, please attempt to recreate the issue using minikube or kind.
## Install minikube/kind
- Minikube https://minikube.sigs.k8s.io/docs/start/
- Kind https://kind.sigs.k8s.io/docs/user/quick-start/
If this is a UI bug, please provide a screenshot of the bug or a link to a youtube video of the bug in action
You can include images using the .md style of

To record a screencast, mac users can use QuickTime and then create an unlisted youtube video with the resulting .mov file.
--->
**Anything else we need to know**:
Check this in action [here](https://youtu.be/fXEQ-yOwMrM).
<!--
How often does this problem occur? Once? Every time etc?
Any relevant logs to include? Put them here in side a detail tag:
<details><summary>x.log</summary> lots of stuff </details>
-->
| https://github.com/apache/airflow/issues/12254 | https://github.com/apache/airflow/pull/12268 | 0cd1c846b2fb4d830b87e11b884094ee4765ab22 | 938c512c6d9e05865cb6c8e0098ba6dba5ef55b6 | "2020-11-10T15:38:38Z" | python | "2020-11-10T22:18:15Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,231 | ["setup.py"] | Extras installation was lost in the last rebase | The #11526 was badly rebased just before beta1 relase and few
lines installing the providers were lost.
| https://github.com/apache/airflow/issues/12231 | https://github.com/apache/airflow/pull/12233 | 45587a664433991b01a24bf0210116c3b562adc7 | 5912d0cae7033b3e2549280677dd60faa53be5e7 | "2020-11-10T07:52:14Z" | python | "2020-11-10T10:52:55Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,201 | ["airflow/api_connexion/endpoints/dag_run_endpoint.py", "airflow/api_connexion/parameters.py", "airflow/api_connexion/schemas/dag_run_schema.py", "airflow/api_connexion/schemas/task_instance_schema.py", "tests/api_connexion/endpoints/test_dag_run_endpoint.py", "tests/api_connexion/endpoints/test_task_instance_endpoint.py", "tests/api_connexion/test_parameters.py"] | (REST API)Triggering a dagrun with naive datetime raises an html page instead of json | Triggering a dagrun with naive datetime raises an HTML page instead of JSON
**Apache Airflow version**:2.0
**What happened**:
An HTML page was returned instead of json when the REST API was triggered with a naive datetime
**What you expected to happen**:
I expected it to raise error status 400 indicating that it's a bad request
**How to reproduce it**:
1. Start the web server and scheduler inside breeze.
2. Make a post request to this endpoint: http://localhost:28080/api/v1/dags/example_bash_operator/dagRuns
to trigger dagruns using this request body: {"execution_date": "2020-11-09T16:25:56.939143"}. or any naive datetime
3. See that an HTML is raised instead of Json
| https://github.com/apache/airflow/issues/12201 | https://github.com/apache/airflow/pull/12248 | 0d37c59669afebe774355a310a889e3cfa378862 | 7478e18ee55eed80a2b8a8f7599b95d0955986c0 | "2020-11-09T15:38:13Z" | python | "2020-11-11T19:10:44Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,197 | [".gitignore", "dev/provider_packages/MANIFEST_TEMPLATE.in.jinja2", "dev/provider_packages/SETUP_TEMPLATE.py.jinja2", "dev/provider_packages/prepare_provider_packages.py"] | Provider packages don't include datafiles | The amazon and google providers don't include necessary datafiles in them.
They were previously included in the sdist via MANIFEST.in (see https://github.com/apache/airflow/pull/12196) and in the bdist via include_package_data from the top level setup.py
Both of these are currently missing.
I've put this against 2.0.0-beta1, it _could_ be changed separately as providers are separate releases. | https://github.com/apache/airflow/issues/12197 | https://github.com/apache/airflow/pull/12200 | 55c401dbf9f7bf8730158f52a5ccc4aa7ab06381 | 765cbbcd76900fd0777730aaba637058b089ac95 | "2020-11-09T13:35:08Z" | python | "2020-11-09T15:57:01Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,168 | ["airflow/www/views.py", "tests/www/test_views.py"] | More meaningful XCom List View Name | In flask-appbuilder, the List View would use "List " + prettied model name as the title.
This makes sense for most of the cases, like `Variable`, `Connection`, etc.
But for `XCom`, the title is not making much sense.
(I'm testing with 2.0.0a2. But actually this issue exists in earlier 1.x versions as well).

<!--
Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.
Don't worry if they're not all applicable; just try to include what you can :-)
If you need to include code snippets or logs, please put them in fenced code
blocks. If they're super-long, please use the details tag like
<details><summary>super-long log</summary> lots of stuff </details>
Please delete these comment blocks before submitting the issue.
-->
<!--
IMPORTANT!!!
PLEASE CHECK "SIMILAR TO X EXISTING ISSUES" OPTION IF VISIBLE
NEXT TO "SUBMIT NEW ISSUE" BUTTON!!!
PLEASE CHECK IF THIS ISSUE HAS BEEN REPORTED PREVIOUSLY USING SEARCH!!!
Please complete the next sections or the issue will be closed.
These questions are the first thing we need to know to understand the context.
-->
**Apache Airflow version**:
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release):
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
**What happened**:
<!-- (please include exact error messages if you can) -->
**What you expected to happen**:
<!-- What do you think went wrong? -->
**How to reproduce it**:
<!---
As minimally and precisely as possible. Keep in mind we do not have access to your cluster or dags.
If you are using kubernetes, please attempt to recreate the issue using minikube or kind.
## Install minikube/kind
- Minikube https://minikube.sigs.k8s.io/docs/start/
- Kind https://kind.sigs.k8s.io/docs/user/quick-start/
If this is a UI bug, please provide a screenshot of the bug or a link to a youtube video of the bug in action
You can include images using the .md style of

To record a screencast, mac users can use QuickTime and then create an unlisted youtube video with the resulting .mov file.
--->
**Anything else we need to know**:
<!--
How often does this problem occur? Once? Every time etc?
Any relevant logs to include? Put them here in side a detail tag:
<details><summary>x.log</summary> lots of stuff </details>
-->
| https://github.com/apache/airflow/issues/12168 | https://github.com/apache/airflow/pull/12169 | bedaf5353d87604d12442ecb0f481cb4d85d9ab4 | 8d5ad6969ff68deea3aca3c98b4a982597f330a0 | "2020-11-07T19:50:14Z" | python | "2020-11-07T22:04:52Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,150 | ["scripts/ci/images/ci_prepare_prod_image_on_ci.sh", "setup.py"] | Production image has only amazon and google providers installed | When "production" image is prepared for, only amazon and google providers are installed from sources.
**Apache Airflow version**:
master
**What you expected to happen**:
All providers should be installed
**How to reproduce it**:
```
./breeze --production-image --python 3.6
```
Then:
```
./breeze --production-image --python 3.6 shell bash
```
then
```
ls -la ~/.local/lib/python3.6/site-packages/airflow/providers/
amazon
google
```
UPDATE:
They are not even installed:
```
.
./amazon
./amazon/aws
./amazon/aws/hooks
./amazon/aws/hooks/batch_waiters.json
./google
./google/cloud
./google/cloud/example_dags
./google/cloud/example_dags/example_bigquery_query.sql
./google/cloud/example_dags/example_cloud_build.yaml
./google/cloud/example_dags/example_spanner.sql
```
| https://github.com/apache/airflow/issues/12150 | https://github.com/apache/airflow/pull/12154 | 92e405e72922cc569a2e41281df9d055c3a7855d | eaac361f3bb29cd3bbd459488fcf31c28ed8fb2b | "2020-11-07T09:27:40Z" | python | "2020-11-09T12:26:24Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,131 | ["airflow/api_connexion/exceptions.py", "airflow/api_connexion/openapi/v1.yaml", "airflow/www/extensions/init_views.py", "tests/api_connexion/test_error_handling.py"] | Return Json for all `Not Found` views in REST API | Currently if an API URL that does not exist is requested, an html page is returned instead of JSON
**Apache Airflow version**:2.0
**What happened**:
Making a request against an endpoint that does not exist returns an html page
**What you expected to happen**:
I expected it to return a 404 error in JSON indicating that the URL was not found.
**How to reproduce it**:
1. Start airflow website and scheduler in breeze.
2. Make a request against an endpoint that does not exist, e.g. http://localhost:28080/api/v1/getconnectiond
3. Notice that an HTML page is returned instead of JSON.
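One possible shape of a fix — a sketch only, not the actual Airflow wiring, with the handler name and payload fields below being assumptions modelled on a problem-details style response — is a 404 handler that answers in JSON for any path under `/api/v1/`:
```python
from flask import Flask, jsonify, request

app = Flask(__name__)


@app.errorhandler(404)
def handle_not_found(error):
    # API paths get a JSON body; the regular UI keeps its HTML error page.
    if request.path.startswith("/api/v1/"):
        payload = {
            "detail": None,
            "status": 404,
            "title": "Not Found",
            "type": "about:blank",
        }
        return jsonify(payload), 404
    return error.get_response()
```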
cc: @mik-laj
| https://github.com/apache/airflow/issues/12131 | https://github.com/apache/airflow/pull/12305 | 763b40d223e5e5512494a97f8335e16960e6adc3 | 966ee7d99466ba841e5fd7cd29f050ae59e75c85 | "2020-11-06T10:24:51Z" | python | "2020-11-18T04:35:22Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,121 | ["airflow/api_connexion/schemas/dag_schema.py", "setup.cfg", "tests/api_connexion/endpoints/test_dag_endpoint.py", "tests/api_connexion/schemas/test_dag_schema.py"] | Bugs in REST API get dag details endpoint. | Here are some bugs in the dag details endpoint.
1. Calling the `dags/{dag_id}/details` endpoint returns an error because the `doc_md` field is not nullable.
2. The dag details endpoint `GET /dags/{dag_id}/details` does not return JSON if the dag does not exist.
3. This endpoint does not return a `file_token`, which is needed to get the source code of a DAG and is also expected judging from the documentation.
**Apache Airflow version**: 2.0
**What happened**:
1. Calling the endpoint `GET /dags/{dag_id}/details` returns an error.
2. When the above is fixed and you look for the details of a dag that does not exist, this endpoint returns a webpage instead of JSON.
3. Calling the endpoint with a dag ID that exists doesn't return a `file_token`.
**What you expected to happen**:
1. I expected it to return the details of the dag
2. I expected it to return JSON indicating that the dag was not found if the dag doesn't exist
3. I expected it to return a file_token in the list of fields when the dag exists
**How to reproduce it**:
Run airflow in breeze.
Make a request to `/dags/{dag_id}/details`. It will return an error.
Go to openapi.yaml and set `doc_md` to be nullable.
Make another request to see that it returns a result without a file_token.
Change the dag id to a dag ID that does not exist.
Make another request and see that it returns a webpage.
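A minimal client-side check of all three bugs — a sketch that assumes a local breeze instance on port 28080 with basic auth enabled; the credentials are placeholders:
```python
import requests

BASE = "http://localhost:28080/api/v1"
AUTH = ("admin", "admin")  # placeholder credentials

# Bugs 1 and 3: an existing DAG currently errors out because of doc_md, and even
# once that is patched the payload carries no "file_token" key.
resp = requests.get(f"{BASE}/dags/example_bash_operator/details", auth=AUTH)
print(resp.status_code, "file_token" in resp.text)

# Bug 2: a DAG that does not exist comes back as an HTML page rather than JSON.
resp = requests.get(f"{BASE}/dags/does_not_exist/details", auth=AUTH)
print(resp.status_code, resp.headers.get("Content-Type"))
```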
| https://github.com/apache/airflow/issues/12121 | https://github.com/apache/airflow/pull/12463 | c34ef853c890e08f5468183c03dc8f3f3ce84af2 | 20843ff89ddbdac45f7ecf9913c4e38685089eb4 | "2020-11-06T00:24:52Z" | python | "2020-11-20T16:28:55Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,081 | ["airflow/providers/papermill/operators/papermill.py", "tests/providers/papermill/operators/test_papermill.py"] | PapermillOperator does not take user defined macros | <!--
Welcome to Apache Airflow! For a smooth issue process, try to answer the following questions.
Don't worry if they're not all applicable; just try to include what you can :-)
If you need to include code snippets or logs, please put them in fenced code
blocks. If they're super-long, please use the details tag like
<details><summary>super-long log</summary> lots of stuff </details>
Please delete these comment blocks before submitting the issue.
-->
<!--
IMPORTANT!!!
PLEASE CHECK "SIMILAR TO X EXISTING ISSUES" OPTION IF VISIBLE
NEXT TO "SUBMIT NEW ISSUE" BUTTON!!!
PLEASE CHECK IF THIS ISSUE HAS BEEN REPORTED PREVIOUSLY USING SEARCH!!!
Please complete the next sections or the issue will be closed.
These questions are the first thing we need to know to understand the context.
-->
**Apache Airflow version**:
1.10.12
**Kubernetes version (if you are using kubernetes)** (use `kubectl version`):
1.15.11
**Environment**:
- **Cloud provider or hardware configuration**:
- **OS** (e.g. from /etc/os-release):
- **Kernel** (e.g. `uname -a`):
- **Install tools**:
- **Others**:
**What happened**:
- When initializing the DAG, I defined `user_defined_macros` with my macros.
- When I use a user-defined macro in the PapermillOperator, it does not find the user-defined macros.
<!-- (please include exact error messages if you can) -->
```
[2020-11-04 16:35:10,822] {taskinstance.py:1150} ERROR - 'seoul_date' is undefined
Traceback (most recent call last):
File "/opt/pyenv/versions/3.6.8/lib/python3.6/site-packages/airflow/models/taskinstance.py", line 984, in _run_raw_task
result = task_copy.execute(context=context)
File "/opt/pyenv/versions/3.6.8/lib/python3.6/site-packages/airflow/operators/papermill_operator.py", line 57, in execute
parameters=self.inlets[i].parameters,
File "/opt/pyenv/versions/3.6.8/lib/python3.6/site-packages/airflow/lineage/datasets.py", line 69, in __getattr__
).render(**self.context)
File "/opt/pyenv/versions/3.6.8/lib/python3.6/site-packages/jinja2/environment.py", line 1090, in render
self.environment.handle_exception()
File "/opt/pyenv/versions/3.6.8/lib/python3.6/site-packages/jinja2/environment.py", line 832, in handle_exception
reraise(*rewrite_traceback_stack(source=source))
File "/opt/pyenv/versions/3.6.8/lib/python3.6/site-packages/jinja2/_compat.py", line 28, in reraise
raise value.with_traceback(tb)
File "<template>", line 1, in top-level template code
jinja2.exceptions.UndefinedError: 'seoul_date' is undefined
```
**What you expected to happen**:
The PapermillOperator should find user-defined macros.
<!-- What do you think went wrong? -->
**How to reproduce it**:
<!---
As minimally and precisely as possible. Keep in mind we do not have access to your cluster or dags.
If you are using kubernetes, please attempt to recreate the issue using minikube or kind.
## Install minikube/kind
- Minikube https://minikube.sigs.k8s.io/docs/start/
- Kind https://kind.sigs.k8s.io/docs/user/quick-start/
If this is a UI bug, please provide a screenshot of the bug or a link to a youtube video of the bug in action
You can include images using the .md style of

To record a screencast, mac users can use QuickTime and then create an unlisted youtube video with the resulting .mov file.
--->
**Anything else we need to know**:
The PapermillOperator uses the `Notebook` class, which is derived from `DataSet`:
https://github.com/apache/airflow/blob/65df1e802190d262b5e18fa9bc2e055768b96e28/airflow/operators/papermill_operator.py#L27
And `DataSet` does not seem to pick up user-defined macros from the DAG.
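A minimal DAG that reproduces the failure — a sketch in which the notebook paths and the `seoul_date` macro body are assumptions, chosen to match the traceback above:
```python
from datetime import datetime

import pendulum
from airflow import DAG
from airflow.operators.papermill_operator import PapermillOperator


def seoul_date(ds):
    # Hypothetical macro: render the execution date in Asia/Seoul time.
    return pendulum.parse(ds).in_timezone("Asia/Seoul").to_date_string()


with DAG(
    dag_id="papermill_macro_repro",
    start_date=datetime(2020, 11, 1),
    schedule_interval="@daily",
    user_defined_macros={"seoul_date": seoul_date},
) as dag:
    PapermillOperator(
        task_id="run_notebook",
        input_nb="/tmp/input.ipynb",
        output_nb="/tmp/output_{{ seoul_date(ds) }}.ipynb",
        # Rendering fails with "'seoul_date' is undefined": the lineage DataSet
        # renders its templates without the DAG's user_defined_macros.
        parameters={"date": "{{ seoul_date(ds) }}"},
    )
```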
<!--
How often does this problem occur? Once? Every time etc?
Any relevant logs to include? Put them here in side a detail tag:
<details><summary>x.log</summary> lots of stuff </details>
-->
| https://github.com/apache/airflow/issues/12081 | https://github.com/apache/airflow/pull/18357 | 1008d8bf8acf459dbc692691a589c27fa4567123 | f382a79adabb2372a1ca5d9e43ed34afd9dec33d | "2020-11-04T08:34:48Z" | python | "2021-09-20T05:06:47Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,042 | ["docs/installation.rst"] | Fix description of some extra packages | **Description**
PR #12023 added a consistency check for documentation and setup.py. However, a few packages still need additional descriptions:
- [x] pagerduty
- [ ] plexus
- [ ] sentry
- [x] singularity
- [ ] tableau
- [ ] virtualenv
This is a great "good-first-issue"
**Use case / motivation**
Descriptions should be complete.
| https://github.com/apache/airflow/issues/12042 | https://github.com/apache/airflow/pull/12141 | 128c9918b5f79cb46a563b77e803c29548c4319c | 4df25e94de74b6f430b1f05235715b99e56ab3db | "2020-11-02T14:22:33Z" | python | "2020-11-06T19:41:32Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,035 | ["docs/apache-airflow-providers-amazon/connections/aws.rst"] | Support AssumeRoleWithWebIdentity for AWS provider | **Description**
Support [AssumeRoleWithWebIdentity](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/sts.html#STS.Client.assume_role_with_web_identity) for the AWS provider when running Airflow workers in EKS.
**Use case / motivation**
This feature will allow us to use [IRSA](https://aws.amazon.com/blogs/opensource/introducing-fine-grained-iam-roles-service-accounts/) with Airflow Pods running on EKS and cross-account _assumeRole_.
A connection of type `aws` with an empty username & password and the following extra parameters
```json
{
"role_arn": "<role_arn>",
"region_name": "<region>",
"aws_session_token": "file://$AWS_WEB_IDENTITY_TOKEN_FILE"
}
```
will retrieve temporary credentials using the `sts-assume-role-with-web-identity` method (see the [boto3 documentation](https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/sts.html#STS.Client.assume_role_with_web_identity) for this method).
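For reference, a rough sketch of the underlying STS call the provider would need to make — the role ARN, session name, and region are placeholders, and the session wiring is an assumption rather than the hook's actual API:
```python
import os

import boto3

with open(os.environ["AWS_WEB_IDENTITY_TOKEN_FILE"]) as token_file:
    web_identity_token = token_file.read()

sts = boto3.client("sts", region_name="eu-west-1")
response = sts.assume_role_with_web_identity(
    RoleArn="arn:aws:iam::123456789012:role/airflow-worker",  # placeholder
    RoleSessionName="airflow-irsa",
    WebIdentityToken=web_identity_token,
)

credentials = response["Credentials"]
session = boto3.session.Session(
    aws_access_key_id=credentials["AccessKeyId"],
    aws_secret_access_key=credentials["SecretAccessKey"],
    aws_session_token=credentials["SessionToken"],
    region_name="eu-west-1",
)
```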
| https://github.com/apache/airflow/issues/12035 | https://github.com/apache/airflow/pull/17283 | 8fa4a8b587a3672156110fc4cf5c04bdf6830867 | c52e4f35170cc3cd9d597110bc24c270af553ca2 | "2020-11-02T11:03:23Z" | python | "2021-08-01T22:38:32Z" |
closed | apache/airflow | https://github.com/apache/airflow | 12,030 | ["airflow/www/static/css/main.css"] | Airflow WebUI style broken | **Apache Airflow version**:
2.0.0 - master branch
**Environment**:
- **OS** (e.g. from /etc/os-release): MacOS 10.15.7
- **Browser**: Safari Version 13.1.3 (15609.4.1)
**What happened**:
The row in the task instance list table is wider than 100% of the page width.
<img width="1440" alt="Screenshot 2020-11-02 at 11 11 39" src="https://user-images.githubusercontent.com/9528307/97856398-85f9a980-1cfc-11eb-89ed-d739ecb0f51c.png">
**What you expected to happen**:
I expect the row to not overflow.
**How to reproduce it**:
Go to http://0.0.0.0:28080/taskinstance/list
**Anything else we need to know**:
N/A
| https://github.com/apache/airflow/issues/12030 | https://github.com/apache/airflow/pull/12048 | b72bd4ae6b0e62689b126463396bf1e59f068543 | a1a1fc9f32940a8abbfc4a12d32321d75ac8268c | "2020-11-02T10:16:20Z" | python | "2020-11-02T16:55:55Z" |