This article walks through processing data in a three-layer architecture with Google Cloud Functions. The solution is orchestrated with Google Cloud Composer and deployed automatically via GitHub Actions. We cover the tools involved, the deployment process, and the pipeline steps, providing a clear guide to building an end-to-end cloud-based data pipeline.
Before setting up the project, make sure you have the following:
To access your GCP project and resources securely, set up the following secrets in GitHub Actions (this workflow reads `GCP_DEVOPS_SA_KEY`, `PROJECT_ID`, and `BUCKET_DATALAKE`):
1. In the project repository, open **Settings**.
2. Under **Security**, select **Secrets and variables**.
3. Click **Actions**.
4. Click **New repository secret**, then enter a **name** and **value** for the secret.
For more details, see:
https://docs.github.com/pt/actions/security-for-github-actions/security-guides/using-secrets-in-github-actions
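If you prefer to script this instead of clicking through the UI, the same secrets can be created with the GitHub CLI. A minimal sketch, assuming the secret names this workflow reads and a service-account key file downloaded from GCP (the values and file path are illustrative):

```bash
# Assumes the gh CLI is installed and authenticated against the repository.
gh secret set PROJECT_ID --body "my-gcp-project"          # illustrative value
gh secret set BUCKET_DATALAKE --body "my-datalake-bucket" # illustrative value
# Pipe in the full JSON key of the deployment service account (hypothetical path):
gh secret set GCP_DEVOPS_SA_KEY < ./devops-sa-key.json
```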
In GCP, create a service account with permissions for Cloud Functions, Composer, BigQuery, and Cloud Storage, and grant it the necessary roles (for example, the Composer, Storage, and Cloud Functions roles bound in the `deploy-composer-service-account` job below).
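As a command-line sketch of that setup (the account name, project id, and exact role list here are illustrative, not prescriptive):

```bash
# Create the deployment service account used by GitHub Actions.
gcloud iam service-accounts create devops-sa \
  --display-name="DevOps deployment account"

# Grant the roles the workflow relies on.
for role in roles/cloudfunctions.admin roles/composer.admin \
            roles/bigquery.admin roles/storage.admin roles/iam.serviceAccountUser; do
  gcloud projects add-iam-policy-binding my-gcp-project \
    --member="serviceAccount:devops-sa@my-gcp-project.iam.gserviceaccount.com" \
    --role="$role"
done

# Export the JSON key that becomes the GCP_DEVOPS_SA_KEY secret.
gcloud iam service-accounts keys create devops-sa-key.json \
  --iam-account=devops-sa@my-gcp-project.iam.gserviceaccount.com
```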
The pipeline automates the entire deployment process, ensuring all components are set up correctly. Below is a breakdown of the key jobs in the GitHub Actions file, each responsible for a different aspect of the deployment.
```yaml
enable-services:
  runs-on: ubuntu-22.04
  steps:
    - uses: actions/checkout@v2
    # Step to authenticate with GCP
    - name: Authorize GCP
      uses: 'google-github-actions/auth@v2'
      with:
        credentials_json: ${{ secrets.GCP_DEVOPS_SA_KEY }}
    # Step to configure the Cloud SDK
    - name: Set up Cloud SDK
      uses: google-github-actions/setup-gcloud@v2
      with:
        version: '>= 363.0.0'
        project_id: ${{ secrets.PROJECT_ID }}
    # Step to configure Docker to use the gcloud command-line tool as a credential helper
    - name: Configure Docker
      run: |-
        gcloud auth configure-docker
    - name: Set up python 3.8
      uses: actions/setup-python@v2
      with:
        python-version: 3.8.16
    # Step to enable the required GCP service APIs
    - name: Enable gcp service api's
      run: |-
        gcloud services enable ${{ env.GCP_SERVICE_API_0 }}
        gcloud services enable ${{ env.GCP_SERVICE_API_1 }}
        gcloud services enable ${{ env.GCP_SERVICE_API_2 }}
        gcloud services enable ${{ env.GCP_SERVICE_API_3 }}
        gcloud services enable ${{ env.GCP_SERVICE_API_4 }}
```
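The `GCP_SERVICE_API_*` values are workflow-level `env` entries holding the API names to activate. A quick local check that the job did its work (project id is a placeholder):

```bash
# List the enabled APIs and filter for the ones this pipeline needs.
gcloud services list --enabled --project=my-gcp-project \
  | grep -E 'cloudfunctions|composer|bigquery|storage'
```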
```yaml
deploy-buckets:
  needs: [enable-services]
  runs-on: ubuntu-22.04
  timeout-minutes: 10
  steps:
    - name: Checkout
      uses: actions/checkout@v4
    # Step to authenticate with GCP
    - name: Authorize GCP
      uses: 'google-github-actions/auth@v2'
      with:
        credentials_json: ${{ secrets.GCP_DEVOPS_SA_KEY }}
    # Step to configure the Cloud SDK
    - name: Set up Cloud SDK
      uses: google-github-actions/setup-gcloud@v2
      with:
        version: '>= 363.0.0'
        project_id: ${{ secrets.PROJECT_ID }}
    # Step to configure Docker to use the gcloud command-line tool as a credential helper
    - name: Configure Docker
      run: |-
        gcloud auth configure-docker
    # Step to create the GCP bucket if it does not exist yet
    - name: Create Google Cloud Storage - datalake
      run: |-
        if ! gsutil ls -p ${{ secrets.PROJECT_ID }} gs://${{ secrets.BUCKET_DATALAKE }} &> /dev/null; then
          gcloud storage buckets create gs://${{ secrets.BUCKET_DATALAKE }} --default-storage-class=nearline --location=${{ env.REGION }}
        else
          echo "Cloud Storage: gs://${{ secrets.BUCKET_DATALAKE }} already exists"
        fi
    # Step to upload the transient files to the GCP bucket
    - name: Upload transient files to Google Cloud Storage
      run: |-
        TARGET=${{ env.INPUT_FOLDER }}
        BUCKET_PATH=${{ secrets.BUCKET_DATALAKE }}/${{ env.INPUT_FOLDER }}
        gsutil cp -r $TARGET gs://${BUCKET_PATH}
```
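After this job runs, the bucket and the uploaded transient files can be verified from any shell (the bucket name stands in for whatever `BUCKET_DATALAKE` holds):

```bash
# Recursively list the data lake bucket, including the transient input folder.
gsutil ls -r gs://my-datalake-bucket/
```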
```yaml
deploy-cloud-function:
  needs: [enable-services, deploy-buckets]
  runs-on: ubuntu-22.04
  steps:
    - uses: actions/checkout@v2
    # Step to authenticate with GCP
    - name: Authorize GCP
      uses: 'google-github-actions/auth@v2'
      with:
        credentials_json: ${{ secrets.GCP_DEVOPS_SA_KEY }}
    # Step to configure the Cloud SDK
    - name: Set up Cloud SDK
      uses: google-github-actions/setup-gcloud@v2
      with:
        version: '>= 363.0.0'
        project_id: ${{ secrets.PROJECT_ID }}
    # Step to configure Docker to use the gcloud command-line tool as a credential helper
    - name: Configure Docker
      run: |-
        gcloud auth configure-docker
    - name: Set up python 3.10
      uses: actions/setup-python@v2
      with:
        python-version: 3.10.12
    # cloud_function_scripts/csv_to_parquet
    - name: Create cloud function - ${{ env.CLOUD_FUNCTION_1_NAME }}
      run: |-
        cd ${{ env.FUNCTION_SCRIPTS }}/${{ env.CLOUD_FUNCTION_1_NAME }}
        gcloud functions deploy ${{ env.CLOUD_FUNCTION_1_NAME }} \
          --gen2 \
          --cpu=${{ env.FUNCTION_CPU }} \
          --memory=${{ env.FUNCTION_MEMORY }} \
          --runtime ${{ env.PYTHON_FUNCTION_RUNTIME }} \
          --trigger-http \
          --region ${{ env.REGION }} \
          --entry-point ${{ env.CLOUD_FUNCTION_1_NAME }}
    - name: Create cloud function - ${{ env.CLOUD_FUNCTION_2_NAME }}
      run: |-
        cd ${{ env.FUNCTION_SCRIPTS }}/${{ env.CLOUD_FUNCTION_2_NAME }}
        gcloud functions deploy ${{ env.CLOUD_FUNCTION_2_NAME }} \
          --gen2 \
          --cpu=${{ env.FUNCTION_CPU }} \
          --memory=${{ env.FUNCTION_MEMORY }} \
          --runtime ${{ env.PYTHON_FUNCTION_RUNTIME }} \
          --trigger-http \
          --region ${{ env.REGION }} \
          --entry-point ${{ env.CLOUD_FUNCTION_2_NAME }}
    - name: Create cloud function - ${{ env.CLOUD_FUNCTION_3_NAME }}
      run: |-
        cd ${{ env.FUNCTION_SCRIPTS }}/${{ env.CLOUD_FUNCTION_3_NAME }}
        gcloud functions deploy ${{ env.CLOUD_FUNCTION_3_NAME }} \
          --gen2 \
          --cpu=${{ env.FUNCTION_CPU }} \
          --memory=${{ env.FUNCTION_MEMORY }} \
          --runtime ${{ env.PYTHON_FUNCTION_RUNTIME }} \
          --trigger-http \
          --region ${{ env.REGION }} \
          --entry-point ${{ env.CLOUD_FUNCTION_3_NAME }}
```
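Each function is deployed as a gen2 HTTP function, so a smoke test is just an authenticated HTTP call. A sketch, assuming the `csv_to_parquet` function, a sample region, and an empty JSON payload:

```bash
# Look up the HTTPS endpoint of the deployed gen2 function.
URL=$(gcloud functions describe csv_to_parquet --gen2 --region=us-central1 \
  --format="value(serviceConfig.uri)")

# Invoke it with the caller's identity token (needs run.invoker on the service).
curl -X POST "$URL" \
  -H "Authorization: Bearer $(gcloud auth print-identity-token)" \
  -H "Content-Type: application/json" \
  -d '{}'
```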
```yaml
deploy-composer-service-account:
  needs: [enable-services, deploy-buckets, deploy-cloud-function]
  runs-on: ubuntu-22.04
  timeout-minutes: 10
  steps:
    - name: Checkout
      uses: actions/checkout@v4
    # Step to authenticate with GCP
    - name: Authorize GCP
      uses: 'google-github-actions/auth@v2'
      with:
        credentials_json: ${{ secrets.GCP_DEVOPS_SA_KEY }}
    # Step to configure the Cloud SDK
    - name: Set up Cloud SDK
      uses: google-github-actions/setup-gcloud@v2
      with:
        version: '>= 363.0.0'
        project_id: ${{ secrets.PROJECT_ID }}
    # Step to configure Docker to use the gcloud command-line tool as a credential helper
    - name: Configure Docker
      run: |-
        gcloud auth configure-docker
    - name: Create service account
      run: |-
        if ! gcloud iam service-accounts list | grep -i ${{ env.SERVICE_ACCOUNT_NAME }} &> /dev/null; then
          gcloud iam service-accounts create ${{ env.SERVICE_ACCOUNT_NAME }} \
            --display-name=${{ env.SERVICE_ACCOUNT_DESCRIPTION }}
        fi
    - name: Add permissions to service account
      run: |-
        gcloud projects add-iam-policy-binding ${{ secrets.PROJECT_ID }} \
          --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
          --role="roles/composer.user"

        gcloud projects add-iam-policy-binding ${{ secrets.PROJECT_ID }} \
          --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
          --role="roles/storage.objectAdmin"

        gcloud projects add-iam-policy-binding ${{ secrets.PROJECT_ID }} \
          --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
          --role="roles/cloudfunctions.invoker"

        # Permission to create and manage Composer environments
        gcloud projects add-iam-policy-binding ${{ secrets.PROJECT_ID }} \
          --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
          --role="roles/composer.admin"

        gcloud projects add-iam-policy-binding ${{ secrets.PROJECT_ID }} \
          --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
          --role="roles/composer.worker"

        # Permission to create and configure instances and resources in the VPC
        gcloud projects add-iam-policy-binding ${{ secrets.PROJECT_ID }} \
          --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
          --role="roles/compute.networkAdmin"

        # Permission to interact with Cloud Storage, needed for buckets and logs
        gcloud projects add-iam-policy-binding ${{ secrets.PROJECT_ID }} \
          --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
          --role="roles/storage.admin"

        # Permission to create and manage project resources such as buckets and instances
        gcloud projects add-iam-policy-binding ${{ secrets.PROJECT_ID }} \
          --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
          --role="roles/editor"

        # Permission to access and use resources required by IAM
        gcloud projects add-iam-policy-binding ${{ secrets.PROJECT_ID }} \
          --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
          --role="roles/iam.serviceAccountUser"

        gcloud functions add-iam-policy-binding ${{ env.CLOUD_FUNCTION_1_NAME }} \
          --region="${{ env.REGION }}" \
          --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
          --role="roles/cloudfunctions.invoker"

        gcloud functions add-invoker-policy-binding ${{ env.CLOUD_FUNCTION_1_NAME }} \
          --region="${{ env.REGION }}" \
          --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com"

        gcloud functions add-iam-policy-binding ${{ env.CLOUD_FUNCTION_2_NAME }} \
          --region="${{ env.REGION }}" \
          --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
          --role="roles/cloudfunctions.invoker"

        gcloud functions add-invoker-policy-binding ${{ env.CLOUD_FUNCTION_2_NAME }} \
          --region="${{ env.REGION }}" \
          --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com"

        gcloud functions add-iam-policy-binding ${{ env.CLOUD_FUNCTION_3_NAME }} \
          --region="${{ env.REGION }}" \
          --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
          --role="roles/cloudfunctions.invoker"

        gcloud functions add-invoker-policy-binding ${{ env.CLOUD_FUNCTION_3_NAME }} \
          --region="${{ env.REGION }}" \
          --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com"

        # Gen2 functions run on Cloud Run, so the underlying services also need run.invoker
        SERVICE_NAME_1=$(gcloud functions describe ${{ env.CLOUD_FUNCTION_1_NAME }} --region=${{ env.REGION }} --format="value(serviceConfig.service)")
        gcloud run services add-iam-policy-binding $SERVICE_NAME_1 \
          --region="${{ env.REGION }}" \
          --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
          --role="roles/run.invoker"

        SERVICE_NAME_2=$(gcloud functions describe ${{ env.CLOUD_FUNCTION_2_NAME }} --region=${{ env.REGION }} --format="value(serviceConfig.service)")
        gcloud run services add-iam-policy-binding $SERVICE_NAME_2 \
          --region="${{ env.REGION }}" \
          --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
          --role="roles/run.invoker"

        SERVICE_NAME_3=$(gcloud functions describe ${{ env.CLOUD_FUNCTION_3_NAME }} --region=${{ env.REGION }} --format="value(serviceConfig.service)")
        gcloud run services add-iam-policy-binding $SERVICE_NAME_3 \
          --region="${{ env.REGION }}" \
          --member="serviceAccount:${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com" \
          --role="roles/run.invoker"

        gcloud functions add-invoker-policy-binding ${{ env.CLOUD_FUNCTION_1_NAME }} \
          --region="${{ env.REGION }}" \
          --member="allUsers"

        gcloud functions add-invoker-policy-binding ${{ env.CLOUD_FUNCTION_2_NAME }} \
          --region="${{ env.REGION }}" \
          --member="allUsers"

        gcloud functions add-invoker-policy-binding ${{ env.CLOUD_FUNCTION_3_NAME }} \
          --region="${{ env.REGION }}" \
          --member="allUsers"
```
```yaml
deploy-bigquery-dataset-bigquery-tables:
  needs: [enable-services, deploy-buckets, deploy-cloud-function, deploy-composer-service-account]
  runs-on: ubuntu-22.04
  timeout-minutes: 10
  steps:
    - name: Checkout
      uses: actions/checkout@v4
    # Step to authenticate with GCP
    - name: Authorize GCP
      uses: 'google-github-actions/auth@v2'
      with:
        credentials_json: ${{ secrets.GCP_DEVOPS_SA_KEY }}
    # Step to configure the Cloud SDK
    - name: Set up Cloud SDK
      uses: google-github-actions/setup-gcloud@v2
      with:
        version: '>= 363.0.0'
        project_id: ${{ secrets.PROJECT_ID }}
    # Step to configure Docker to use the gcloud command-line tool as a credential helper
    - name: Configure Docker
      run: |-
        gcloud auth configure-docker
    - name: Create Big Query Dataset
      run: |-
        if ! bq ls --project_id ${{ secrets.PROJECT_ID }} -a | grep -w ${{ env.BIGQUERY_DATASET }} &> /dev/null; then
          bq --location=${{ env.REGION }} mk \
            --default_table_expiration 0 \
            --dataset ${{ env.BIGQUERY_DATASET }}
        else
          echo "Big Query Dataset: ${{ env.BIGQUERY_DATASET }} already exists"
        fi
    - name: Create Big Query table
      run: |-
        TABLE_NAME_CUSTOMER=${{ env.BIGQUERY_DATASET }}.${{ env.BIGQUERY_TABLE_CUSTOMER }}
        c=0
        for table in $(bq ls --max_results 1000 "${{ secrets.PROJECT_ID }}:${{ env.BIGQUERY_DATASET }}" | tail -n +3 | awk '{print $1}'); do
          # Check that the entry is a TABLE and whether it matches the customer table name
          if bq show --format=prettyjson "${{ env.BIGQUERY_DATASET }}.$table" | jq -r '.type' | grep -q -E "TABLE"; then
            if [ "$table" == "${{ env.BIGQUERY_TABLE_CUSTOMER }}" ]; then
              echo "Dataset ${{ env.BIGQUERY_DATASET }} already has a table named: $table"
              ((c=c+1))
            fi
          else
            echo "Ignoring $table"
            continue
          fi
        done
        echo "counter $c"
        if [ $c == 0 ]; then
          echo "Creating table ${{ env.BIGQUERY_TABLE_CUSTOMER }} in dataset ${{ env.BIGQUERY_DATASET }}"
          bq mk --table \
            $TABLE_NAME_CUSTOMER \
            ./big_query_schemas/customer_schema.json
        fi
```
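Once the job completes, the dataset and table can be checked with `bq` (the names stand in for whatever `BIGQUERY_DATASET` and `BIGQUERY_TABLE_CUSTOMER` hold):

```bash
# Print the schema created from big_query_schemas/customer_schema.json.
bq show --schema --format=prettyjson my_dataset.customer
```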
```yaml
deploy-composer-environment:
  needs: [enable-services, deploy-buckets, deploy-cloud-function, deploy-composer-service-account, deploy-bigquery-dataset-bigquery-tables]
  runs-on: ubuntu-22.04
  timeout-minutes: 40
  steps:
    - name: Checkout
      uses: actions/checkout@v4
    # Step to authenticate with GCP
    - name: Authorize GCP
      uses: 'google-github-actions/auth@v2'
      with:
        credentials_json: ${{ secrets.GCP_DEVOPS_SA_KEY }}
    # Step to configure the Cloud SDK
    - name: Set up Cloud SDK
      uses: google-github-actions/setup-gcloud@v2
      with:
        version: '>= 363.0.0'
        project_id: ${{ secrets.PROJECT_ID }}
    # Step to configure Docker to use the gcloud command-line tool as a credential helper
    - name: Configure Docker
      run: |-
        gcloud auth configure-docker
    # Create the Composer environment if it does not exist yet
    - name: Create composer environments
      run: |-
        if ! gcloud composer environments list --project=${{ secrets.PROJECT_ID }} --locations=${{ env.REGION }} | grep -i ${{ env.COMPOSER_ENV_NAME }} &> /dev/null; then
          gcloud composer environments create ${{ env.COMPOSER_ENV_NAME }} \
            --project ${{ secrets.PROJECT_ID }} \
            --location ${{ env.REGION }} \
            --environment-size ${{ env.COMPOSER_ENV_SIZE }} \
            --image-version ${{ env.COMPOSER_IMAGE_VERSION }} \
            --service-account ${{ env.SERVICE_ACCOUNT_NAME }}@${{ secrets.PROJECT_ID }}.iam.gserviceaccount.com
        else
          echo "Composer environment ${{ env.COMPOSER_ENV_NAME }} already exists!"
        fi
    # Create the Airflow variables the DAG expects
    - name: Create composer variable PROJECT_ID
      run: |-
        gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
          --location ${{ env.REGION }} variables \
          -- set PROJECT_ID ${{ secrets.PROJECT_ID }}
    - name: Create composer variable REGION
      run: |-
        gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
          --location ${{ env.REGION }} variables \
          -- set REGION ${{ env.REGION }}
    - name: Create composer variable CLOUD_FUNCTION_1_NAME
      run: |-
        gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
          --location ${{ env.REGION }} variables \
          -- set CLOUD_FUNCTION_1_NAME ${{ env.CLOUD_FUNCTION_1_NAME }}
    - name: Create composer variable CLOUD_FUNCTION_2_NAME
      run: |-
        gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
          --location ${{ env.REGION }} variables \
          -- set CLOUD_FUNCTION_2_NAME ${{ env.CLOUD_FUNCTION_2_NAME }}
    - name: Create composer variable CLOUD_FUNCTION_3_NAME
      run: |-
        gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
          --location ${{ env.REGION }} variables \
          -- set CLOUD_FUNCTION_3_NAME ${{ env.CLOUD_FUNCTION_3_NAME }}
    - name: Create composer variable BUCKET_DATALAKE
      run: |-
        gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
          --location ${{ env.REGION }} variables \
          -- set BUCKET_NAME ${{ secrets.BUCKET_DATALAKE }}
    - name: Create composer variable TRANSIENT_FILE_PATH
      run: |-
        gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
          --location ${{ env.REGION }} variables \
          -- set TRANSIENT_FILE_PATH ${{ env.TRANSIENT_FILE_PATH }}
    - name: Create composer variable BRONZE_PATH
      run: |-
        gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
          --location ${{ env.REGION }} variables \
          -- set BRONZE_PATH ${{ env.BRONZE_PATH }}
    - name: Create composer variable SILVER_PATH
      run: |-
        gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
          --location ${{ env.REGION }} variables \
          -- set SILVER_PATH ${{ env.SILVER_PATH }}
    - name: Create composer variable REGION_PROJECT_ID
      run: |-
        gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
          --location ${{ env.REGION }} variables \
          -- set REGION_PROJECT_ID "${{ env.REGION }}-${{ secrets.PROJECT_ID }}"
    - name: Create composer variable BIGQUERY_DATASET
      run: |-
        gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
          --location ${{ env.REGION }} variables \
          -- set BIGQUERY_DATASET "${{ env.BIGQUERY_DATASET }}"
    - name: Create composer variable BIGQUERY_TABLE_CUSTOMER
      run: |-
        gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
          --location ${{ env.REGION }} variables \
          -- set BIGQUERY_TABLE_CUSTOMER "${{ env.BIGQUERY_TABLE_CUSTOMER }}"
```
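One subtlety worth noting: the step named `BUCKET_DATALAKE` stores its value under the Airflow key `BUCKET_NAME`, so that is the name the DAG must read. Each `variables -- set` call shells out to the Airflow CLI inside the environment, and the values can be read back the same way (Airflow 2 syntax assumed; environment name and region are placeholders):

```bash
# Read back one of the variables the job just set.
gcloud composer environments run my-composer-env \
  --location us-central1 variables \
  -- get BUCKET_NAME
```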
```yaml
deploy-composer-http-connection:
  needs: [enable-services, deploy-buckets, deploy-cloud-function, deploy-composer-service-account, deploy-bigquery-dataset-bigquery-tables, deploy-composer-environment]
  runs-on: ubuntu-22.04
  steps:
    - name: Checkout
      uses: actions/checkout@v4
    # Step to authenticate with GCP
    - name: Authorize GCP
      uses: 'google-github-actions/auth@v2'
      with:
        credentials_json: ${{ secrets.GCP_DEVOPS_SA_KEY }}
    # Step to configure the Cloud SDK
    - name: Set up Cloud SDK
      uses: google-github-actions/setup-gcloud@v2
      with:
        version: '>= 363.0.0'
        project_id: ${{ secrets.PROJECT_ID }}
    # Step to configure Docker to use the gcloud command-line tool as a credential helper
    - name: Configure Docker
      run: |-
        gcloud auth configure-docker
    - name: Create composer http connection HTTP_CONNECTION
      run: |-
        HOST="https://${{ env.REGION }}-${{ secrets.PROJECT_ID }}.cloudfunctions.net"
        gcloud composer environments run ${{ env.COMPOSER_ENV_NAME }} \
          --location ${{ env.REGION }} connections \
          -- add ${{ env.HTTP_CONNECTION }} \
          --conn-type ${{ env.CONNECTION_TYPE }} \
          --conn-host $HOST
```
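The connection points at the shared `cloudfunctions.net` host for the project and region, so the DAG only appends each function's path when calling it. It can be inspected afterwards (Airflow 2 syntax assumed; the connection id is whatever `HTTP_CONNECTION` holds):

```bash
# Confirm the HTTP connection exists and shows the expected host.
gcloud composer environments run my-composer-env \
  --location us-central1 connections \
  -- get my_http_conn
```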
```yaml
deploy-dags:
  needs: [enable-services, deploy-buckets, deploy-cloud-function, deploy-composer-service-account, deploy-bigquery-dataset-bigquery-tables, deploy-composer-environment, deploy-composer-http-connection]
  runs-on: ubuntu-22.04
  steps:
    - name: Checkout
      uses: actions/checkout@v4
    # Step to authenticate with GCP
    - name: Authorize GCP
      uses: 'google-github-actions/auth@v2'
      with:
        credentials_json: ${{ secrets.GCP_DEVOPS_SA_KEY }}
    - name: Set up Cloud SDK
      uses: google-github-actions/setup-gcloud@v2
      with:
        version: '>= 363.0.0'
        project_id: ${{ secrets.PROJECT_ID }}
    - name: Get Composer bucket name and Deploy DAG to Composer
      run: |-
        # dagGcsPrefix already ends in /dags (gs://<bucket>/dags), so copy straight into it
        COMPOSER_BUCKET=$(gcloud composer environments describe ${{ env.COMPOSER_ENV_NAME }} \
          --location ${{ env.REGION }} \
          --format="value(config.dagGcsPrefix)")
        gsutil -m cp -r ./dags/* $COMPOSER_BUCKET/
```
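With the DAGs synced, the pipeline can be exercised end to end from the CLI. A sketch using the same `gcloud composer environments run` convention as the workflow; the DAG id here is hypothetical and must match whatever your DAG file defines:

```bash
# Unpause and then trigger the pipeline DAG.
gcloud composer environments run my-composer-env \
  --location us-central1 dags \
  -- unpause customer_pipeline_dag
gcloud composer environments run my-composer-env \
  --location us-central1 dags \
  -- trigger customer_pipeline_dag
```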
Once the deployment process completes, the following resources are in place: the Cloud Storage data lake bucket with the transient input files, the three gen2 Cloud Functions, the Composer service account with its IAM bindings, the BigQuery dataset and customer table, and a Composer environment loaded with its Airflow variables, HTTP connection, and DAGs.
This solution demonstrates how to combine Google Cloud Functions, Composer, and BigQuery into a robust three-layer data processing pipeline. Automating the deployment with GitHub Actions keeps the process smooth and repeatable, making cloud-based data pipelines far easier to manage at scale.