Continuing from the previous post, this article covers how to enable HTTPS on Airflow.
As laid out last time, the steps are:
- Obtain an SSL certificate
- Copy the certificate to the Airflow server
- Specify a domain for accessing the Airflow UI
- Register the domain in DNS
- Modify the docker-compose.yml file
- Recreate the Airflow containers
- Test access to the Airflow UI
Obtaining an SSL certificate
This post proceeds on the assumption that you already have a certificate to use. All you need are two files, one ending in .key and one ending in .crt.
If you need a free SSL certificate, click here for a guide to issuing one.
prodo.key
prodo.crt
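Before copying them over, it doesn't hurt to confirm the two files actually belong together. A quick check, assuming an RSA key and OpenSSL available on the machine:

# the two digests must be identical if the key matches the certificate
openssl x509 -noout -modulus -in prodo.crt | openssl md5
openssl rsa -noout -modulus -in prodo.key | openssl md5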
Copying the certificate files to the Airflow server
Copy the key and crt files to a directory on the server. Since this setup uses docker-compose, copy them into the directory that will be mounted into the containers, shown below.
/home/prodo/airflow/certs
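For example, copying from a workstation might look like this — a minimal sketch, assuming the server is reachable as prodo@airflow-server (a placeholder host name):

# create the target directory on the Airflow server, then copy both files into it
ssh prodo@airflow-server 'mkdir -p /home/prodo/airflow/certs'
scp prodo.key prodo.crt prodo@airflow-server:/home/prodo/airflow/certs/
# the private key should not be readable by other users
ssh prodo@airflow-server 'chmod 600 /home/prodo/airflow/certs/prodo.key'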
Specifying a domain for Airflow UI access
Decide on the domain that will be accessed over HTTPS. For example, something like this:
https://airflow.prodo.com
DNS에서 도메인 등록
폐쇄망이라면 위에서 지정한 서브 도메인을 내부 DNS 서버에 등록해줘야 합니다. 그렇지 않으면 크롬에서 안전하지 않은 사이트라는 표시가 계속 나옵니다.
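If you just want to test before the DNS record exists, a hosts entry on the client machine has the same effect — 192.168.0.10 below is a placeholder for the Airflow server's real IP:

# add to /etc/hosts (C:\Windows\System32\drivers\etc\hosts on Windows)
192.168.0.10    airflow.prodo.com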
Modifying the docker-compose.yml file
Two changes go into docker-compose.yml: first, mount the path that holds the certificate files; second, add the settings that make the webserver use the SSL certificate. The yml below is based on the file from the official Airflow Docker page. (reference)
The parts to modify are shown here; see the full docker-compose.yml file below for the surrounding context.
environment:
  AIRFLOW__WEBSERVER__WEB_SERVER_SSL_CERT: '/opt/airflow/certs/prodo.crt'
  AIRFLOW__WEBSERVER__WEB_SERVER_SSL_KEY: '/opt/airflow/certs/prodo.key'
  AIRFLOW__WEBSERVER__BASE_URL: 'https://localhost:8080'
volumes:
  - /home/prodo/airflow/certs:/opt/airflow/certs
Full docker-compose.yml file
version: '3'
x-airflow-common:
  &airflow-common
  # In order to add custom dependencies or upgrade provider packages you can use your extended image.
  # Comment the image line, place your Dockerfile in the directory where you placed the docker-compose.yaml
  # and uncomment the "build" line below, Then run `docker-compose build` to build the images.
  image: ${AIRFLOW_IMAGE_NAME:-apache/airflow:2.3.4}
  # build: .
  environment:
    &airflow-common-env
    AIRFLOW__CORE__EXECUTOR: CeleryExecutor
    AIRFLOW__DATABASE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
    # For backward compatibility, with Airflow <2.3
    AIRFLOW__CORE__SQL_ALCHEMY_CONN: postgresql+psycopg2://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__RESULT_BACKEND: db+postgresql://airflow:airflow@postgres/airflow
    AIRFLOW__CELERY__BROKER_URL: redis://:@redis:6379/0
    AIRFLOW__CORE__FERNET_KEY: ''
    AIRFLOW__CORE__DAGS_ARE_PAUSED_AT_CREATION: 'true'
    AIRFLOW__CORE__LOAD_EXAMPLES: 'true'
    AIRFLOW__API__AUTH_BACKENDS: 'airflow.api.auth.backend.basic_auth'
    # SSL settings: paths are the in-container locations of the mounted certificate files
    AIRFLOW__WEBSERVER__WEB_SERVER_SSL_CERT: '/opt/airflow/certs/prodo.crt'
    AIRFLOW__WEBSERVER__WEB_SERVER_SSL_KEY: '/opt/airflow/certs/prodo.key'
    AIRFLOW__WEBSERVER__BASE_URL: 'https://localhost:8080'
    _PIP_ADDITIONAL_REQUIREMENTS: ${_PIP_ADDITIONAL_REQUIREMENTS:-}
  volumes:
    - ./dags:/opt/airflow/dags
    - ./logs:/opt/airflow/logs
    - ./plugins:/opt/airflow/plugins
    # mount the host directory holding the SSL certificate files
    - /home/prodo/airflow/certs:/opt/airflow/certs
  user: "${AIRFLOW_UID:-50000}:0"
  depends_on:
    &airflow-common-depends-on
    redis:
      condition: service_healthy
    postgres:
      condition: service_healthy

services:
  postgres:
    image: postgres:13
    environment:
      POSTGRES_USER: airflow
      POSTGRES_PASSWORD: airflow
      POSTGRES_DB: airflow
    volumes:
      - postgres-db-volume:/var/lib/postgresql/data
    healthcheck:
      test: ["CMD", "pg_isready", "-U", "airflow"]
      interval: 5s
      retries: 5
    restart: always

  redis:
    image: redis:latest
    expose:
      - 6379
    healthcheck:
      test: ["CMD", "redis-cli", "ping"]
      interval: 5s
      timeout: 30s
      retries: 50
    restart: always

  airflow-webserver:
    <<: *airflow-common
    command: webserver
    ports:
      - 8080:8080
    healthcheck:
      # with SSL enabled the webserver serves HTTPS on 8080, so the healthcheck
      # must use https (--insecure skips validation of a cert the container does not trust)
      test: ["CMD", "curl", "--fail", "--insecure", "https://localhost:8080/health"]
      interval: 10s
      timeout: 10s
      retries: 5
    restart: always
    depends_on:
      <<: *airflow-common-depends-on
      airflow-init:
        condition: service_completed_successfully

  airflow-scheduler:
    <<: *airflow-common
    command: scheduler
    healthcheck:
      test: ["CMD-SHELL", 'airflow jobs check --job-type SchedulerJob --hostname "$${HOSTNAME}"']
      interval: 10s
      timeout: 10s
      retries: 5
    restart: always
    depends_on:
      <<: *airflow-common-depends-on
      airflow-init:
        condition: service_completed_successfully

  airflow-worker:
    <<: *airflow-common
    command: celery worker
    healthcheck:
      test:
        - "CMD-SHELL"
        - 'celery --app airflow.executors.celery_executor.app inspect ping -d "celery@$${HOSTNAME}"'
      interval: 10s
      timeout: 10s
      retries: 5
    environment:
      <<: *airflow-common-env
      # Required to handle warm shutdown of the celery workers properly
      # See https://airflow.apache.org/docs/docker-stack/entrypoint.html#signal-propagation
      DUMB_INIT_SETSID: "0"
    restart: always
    depends_on:
      <<: *airflow-common-depends-on
      airflow-init:
        condition: service_completed_successfully

  airflow-triggerer:
    <<: *airflow-common
    command: triggerer
    healthcheck:
      test: ["CMD-SHELL", 'airflow jobs check --job-type TriggererJob --hostname "$${HOSTNAME}"']
      interval: 10s
      timeout: 10s
      retries: 5
    restart: always
    depends_on:
      <<: *airflow-common-depends-on
      airflow-init:
        condition: service_completed_successfully

  airflow-init:
    <<: *airflow-common
    entrypoint: /bin/bash
    # yamllint disable rule:line-length
    command:
      - -c
      - |
        function ver() {
          printf "%04d%04d%04d%04d" $${1//./ }
        }
        airflow_version=$$(AIRFLOW__LOGGING__LOGGING_LEVEL=INFO && gosu airflow airflow version)
        airflow_version_comparable=$$(ver $${airflow_version})
        min_airflow_version=2.2.0
        min_airflow_version_comparable=$$(ver $${min_airflow_version})
        if (( airflow_version_comparable < min_airflow_version_comparable )); then
          echo
          echo -e "\033[1;31mERROR!!!: Too old Airflow version $${airflow_version}!\e[0m"
          echo "The minimum Airflow version supported: $${min_airflow_version}. Only use this or higher!"
          echo
          exit 1
        fi
        if [[ -z "${AIRFLOW_UID}" ]]; then
          echo
          echo -e "\033[1;33mWARNING!!!: AIRFLOW_UID not set!\e[0m"
          echo "If you are on Linux, you SHOULD follow the instructions below to set "
          echo "AIRFLOW_UID environment variable, otherwise files will be owned by root."
          echo "For other operating systems you can get rid of the warning with manually created .env file:"
          echo "    See: https://airflow.apache.org/docs/apache-airflow/stable/start/docker.html#setting-the-right-airflow-user"
          echo
        fi
        one_meg=1048576
        mem_available=$$(($$(getconf _PHYS_PAGES) * $$(getconf PAGE_SIZE) / one_meg))
        cpus_available=$$(grep -cE 'cpu[0-9]+' /proc/stat)
        disk_available=$$(df / | tail -1 | awk '{print $$4}')
        warning_resources="false"
        if (( mem_available < 4000 )) ; then
          echo
          echo -e "\033[1;33mWARNING!!!: Not enough memory available for Docker.\e[0m"
          echo "At least 4GB of memory required. You have $$(numfmt --to iec $$((mem_available * one_meg)))"
          echo
          warning_resources="true"
        fi
        if (( cpus_available < 2 )); then
          echo
          echo -e "\033[1;33mWARNING!!!: Not enough CPUS available for Docker.\e[0m"
          echo "At least 2 CPUs recommended. You have $${cpus_available}"
          echo
          warning_resources="true"
        fi
        if (( disk_available < one_meg * 10 )); then
          echo
          echo -e "\033[1;33mWARNING!!!: Not enough Disk space available for Docker.\e[0m"
          echo "At least 10 GBs recommended. You have $$(numfmt --to iec $$((disk_available * 1024 )))"
          echo
          warning_resources="true"
        fi
        if [[ $${warning_resources} == "true" ]]; then
          echo
          echo -e "\033[1;33mWARNING!!!: You have not enough resources to run Airflow (see above)!\e[0m"
          echo "Please follow the instructions to increase amount of resources available:"
          echo "   https://airflow.apache.org/docs/apache-airflow/stable/start/docker.html#before-you-begin"
          echo
        fi
        mkdir -p /sources/logs /sources/dags /sources/plugins
        chown -R "${AIRFLOW_UID}:0" /sources/{logs,dags,plugins}
        exec /entrypoint airflow version
    # yamllint enable rule:line-length
    environment:
      <<: *airflow-common-env
      _AIRFLOW_DB_UPGRADE: 'true'
      _AIRFLOW_WWW_USER_CREATE: 'true'
      _AIRFLOW_WWW_USER_USERNAME: ${_AIRFLOW_WWW_USER_USERNAME:-airflow}
      _AIRFLOW_WWW_USER_PASSWORD: ${_AIRFLOW_WWW_USER_PASSWORD:-airflow}
      _PIP_ADDITIONAL_REQUIREMENTS: ''
    user: "0:0"
    volumes:
      - .:/sources

  airflow-cli:
    <<: *airflow-common
    profiles:
      - debug
    environment:
      <<: *airflow-common-env
      CONNECTION_CHECK_MAX_COUNT: "0"
    # Workaround for entrypoint issue. See: https://github.com/apache/airflow/issues/16252
    command:
      - bash
      - -c
      - airflow

  # You can enable flower by adding "--profile flower" option e.g. docker-compose --profile flower up
  # or by explicitly targeted on the command line e.g. docker-compose up flower.
  # See: https://docs.docker.com/compose/profiles/
  flower:
    <<: *airflow-common
    command: celery flower
    profiles:
      - flower
    ports:
      - 5555:5555
    healthcheck:
      test: ["CMD", "curl", "--fail", "http://localhost:5555/"]
      interval: 10s
      timeout: 10s
      retries: 5
    restart: always
    depends_on:
      <<: *airflow-common-depends-on
      airflow-init:
        condition: service_completed_successfully

volumes:
  postgres-db-volume:
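Before recreating anything, it's worth letting Compose validate the edited file; this catches YAML indentation mistakes before any container restarts:

# prints the fully resolved configuration, or an error if the YAML is invalid
docker-compose config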
Recreating the Airflow containers
Recreate the Airflow containers with the commands below.
# remove the existing containers
docker-compose down
# create them again with the new configuration
docker-compose up
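Once the containers come back up, a quick sanity check is to confirm that every service reports healthy:

# all services should eventually show "Up (healthy)"
docker-compose ps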
Testing Airflow UI access
Open the URL with https included, as shown below, and you can confirm that HTTPS is applied correctly.
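A command-line check works too — a sketch assuming the 8080:8080 port mapping from the compose file above (-k skips certificate validation; drop it once the issuing CA is trusted by the client):

# expect JSON with metadatabase and scheduler status from the health endpoint
curl -k https://airflow.prodo.com:8080/health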
That covers how to apply an SSL certificate to a container-based Airflow deployment.