fix folder_dir not showing on logs for DbtDocsS3LocalOperator (astronomer#856)

## Description

The DAG logs do not print the `folder_dir` in the upload destination path:
```python
generate_dbt_docs_aws = DbtDocsS3Operator(
    task_id="generate_dbt_docs_aws",
    project_dir=f"{AIRFLOW_HOME}/dags/dbt/dbt-project",
    profile_config=profile_config,
    env=env_vars,
    append_env=True,
    # docs-specific arguments
    connection_id="aws_default",
    bucket_name="airflow-data-xxxxxxxx-us-east-2",
    folder_dir="dags/dbt/dbt-project/target",
    dag=dbt_cosmos_dag,
)
```

![image](https://github.com/astronomer/astronomer-cosmos/assets/6994647/f57cba9a-4b87-4c36-b580-1b2ddde1eb78)


## Related Issue(s)
None

## Breaking Change?
No

## Checklist

- [ ] I have made corresponding changes to the documentation (if required)
- [ ] I have added tests that prove my fix is effective or that my feature works
PrimOox authored and arojasb3 committed Jul 14, 2024
1 parent 390dcad commit c0f8aa7
Showing 1 changed file with 2 additions and 2 deletions: `cosmos/operators/local.py`.

```diff
@@ -574,9 +574,9 @@ def upload_to_cloud_storage(self, project_dir: str) -> None:
         )

         for filename in self.required_files:
-            logger.info("Uploading %s to %s", filename, f"s3://{self.bucket_name}/{filename}")
+            key = f"{self.folder_dir}/{filename}" if self.folder_dir else filename
+            s3_path = f"s3://{self.bucket_name}/{key}"
+            logger.info("Uploading %s to %s", filename, s3_path)

             hook.load_file(
                 filename=f"{target_dir}/{filename}",
```
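The fixed path construction can be sketched as a standalone helper (a minimal illustration only; `build_s3_path` is a hypothetical name, not part of the cosmos codebase):

```python
def build_s3_path(bucket_name, filename, folder_dir=None):
    """Mirror the fixed operator logic: prefix the key with folder_dir
    when it is set, so the logged path matches the uploaded location."""
    key = f"{folder_dir}/{filename}" if folder_dir else filename
    return f"s3://{bucket_name}/{key}"
```

With `folder_dir` set, the logged destination now includes the folder prefix rather than just the bare filename.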
