[internal] Temporarily disable integration tests due to ES-1302145 #3226

Merged
1 commit merged into main on Nov 8, 2024

Conversation

@nfx (Collaborator) commented Nov 8, 2024

Fix #3202
Fix #3204
Fix #3203

@nfx nfx requested a review from a team as a code owner November 8, 2024 13:02
@nfx nfx merged commit 478548c into main Nov 8, 2024
6 of 7 checks passed
@nfx nfx deleted the fix/3202 branch November 8, 2024 13:06
github-actions bot commented Nov 8, 2024

❌ 94/99 passed, 2 flaky, 5 failed, 7 skipped, 5h11m56s total

❌ test_migrate_view: AssertionError: assert 3 == 4 (6m8.172s)
... (skipped 22700 bytes)
framework/crawlers.py", line 152, in _snapshot
    cached_results = list(fetcher())
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/hive_metastore/grants.py", line 236, in _try_fetch
    for row in self._fetch(f"SELECT * FROM {escape_sql_identifier(self.full_name)}"):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/core.py", line 331, in fetch_all
    execute_response = self.execute(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/core.py", line 255, in execute
    self._raise_if_needed(status)
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/core.py", line 465, in _raise_if_needed
    raise NotFound(error_message)
databricks.sdk.errors.platform.NotFound: [TABLE_OR_VIEW_NOT_FOUND] The table or view `hive_metastore`.`dummy_sv3c6`.`grants` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.
To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS. SQLSTATE: 42P01; line 1 pos 14
13:18 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.grants] crawling new set of snapshot data for grants
13:18 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.tables] fetching tables inventory
13:18 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.udfs] fetching udfs inventory
13:18 DEBUG [databricks.labs.ucx.framework.crawlers] Inventory table not found
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/framework/crawlers.py", line 152, in _snapshot
    cached_results = list(fetcher())
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/hive_metastore/udfs.py", line 63, in _try_fetch
    for row in self._fetch(f"SELECT * FROM {escape_sql_identifier(self.full_name)}"):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/core.py", line 331, in fetch_all
    execute_response = self.execute(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/core.py", line 255, in execute
    self._raise_if_needed(status)
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/core.py", line 465, in _raise_if_needed
    raise NotFound(error_message)
databricks.sdk.errors.platform.NotFound: [TABLE_OR_VIEW_NOT_FOUND] The table or view `hive_metastore`.`dummy_sv3c6`.`udfs` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.
To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS. SQLSTATE: 42P01; line 1 pos 14
13:18 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.udfs] crawling new set of snapshot data for udfs
13:18 DEBUG [databricks.labs.ucx.hive_metastore.udfs] [hive_metastore.dummy_szbtt] listing udfs
13:18 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.udfs] found 0 new records for udfs
13:22 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.grants] found 5 new records for grants
13:22 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.tables] fetching tables inventory
13:22 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.groups] fetching groups inventory
13:22 DEBUG [databricks.labs.ucx.framework.crawlers] Inventory table not found
Traceback (most recent call last):
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/framework/crawlers.py", line 152, in _snapshot
    cached_results = list(fetcher())
  File "/home/runner/work/ucx/ucx/src/databricks/labs/ucx/workspace_access/groups.py", line 629, in _try_fetch
    for row in self._sql_backend.fetch(f"SELECT * FROM {escape_sql_identifier(self.full_name)}"):
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/core.py", line 331, in fetch_all
    execute_response = self.execute(
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/core.py", line 255, in execute
    self._raise_if_needed(status)
  File "/home/runner/work/ucx/ucx/.venv/lib/python3.10/site-packages/databricks/labs/lsql/core.py", line 465, in _raise_if_needed
    raise NotFound(error_message)
databricks.sdk.errors.platform.NotFound: [TABLE_OR_VIEW_NOT_FOUND] The table or view `hive_metastore`.`dummy_sv3c6`.`groups` cannot be found. Verify the spelling and correctness of the schema and catalog.
If you did not qualify the name with a schema, verify the current_schema() output, or qualify the name with the correct schema and catalog.
To tolerate the error on drop use DROP VIEW IF EXISTS or DROP TABLE IF EXISTS. SQLSTATE: 42P01; line 1 pos 14
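The three identical NotFound-then-crawl sequences above (for `grants`, `udfs`, and `groups`) follow the crawler's fetch-or-crawl pattern: try the cached inventory table first, and fall back to crawling the live workspace when the table does not exist yet. A minimal sketch of that control flow, using illustrative names rather than the actual ucx classes:

```python
class NotFound(Exception):
    """Stand-in for databricks.sdk.errors.platform.NotFound."""


class CrawlerSketch:
    """Illustrative fetch-or-crawl snapshot, not the real ucx crawler."""

    def __init__(self, fetcher, loader):
        self._fetcher = fetcher  # reads the inventory table; may raise NotFound
        self._loader = loader    # crawls the live workspace for fresh records

    def snapshot(self, *, force_refresh=False):
        if not force_refresh:
            try:
                cached = list(self._fetcher())
                if cached:
                    return cached
            except NotFound:
                # "Inventory table not found": fall through and crawl fresh data.
                pass
        # A real crawler would also persist the fresh records back to the
        # inventory table before returning them.
        return list(self._loader())


# First call hits the missing inventory table and falls back to the loader,
# mirroring the "Inventory table not found" DEBUG lines above.
def missing_table():
    raise NotFound("[TABLE_OR_VIEW_NOT_FOUND]")

crawler = CrawlerSketch(missing_table, lambda: ["grant-1", "grant-2"])
print(crawler.snapshot())  # → ['grant-1', 'grant-2']
```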
13:22 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.groups] crawling new set of snapshot data for groups
13:22 INFO [databricks.labs.ucx.workspace_access.groups] Listing workspace groups (resource_type=WorkspaceGroup) with id,displayName,meta,externalId,members,roles,entitlements ...
13:22 INFO [databricks.labs.ucx.workspace_access.groups] Found 0 WorkspaceGroup
13:22 INFO [databricks.labs.ucx.workspace_access.groups] Listing account groups with id,displayName,externalId...
13:22 INFO [databricks.labs.ucx.workspace_access.groups] Found 68 account groups
13:22 INFO [databricks.labs.ucx.workspace_access.groups] No group listing provided, all matching groups will be migrated
13:22 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.groups] found 0 new records for groups
13:22 DEBUG [databricks.labs.ucx.hive_metastore.grants] Migrating acls on dummy_chauy.dummy_szbtt.dummy_t5gxs using SQL query: GRANT ALL PRIVILEGES ON TABLE `dummy_chauy`.`dummy_szbtt`.`dummy_t5gxs` TO `[email protected]`
13:22 DEBUG [databricks.labs.ucx.hive_metastore.grants] Migrating acls on dummy_chauy.dummy_szbtt.dummy_t5gxs using SQL query: ALTER TABLE `dummy_chauy`.`dummy_szbtt`.`dummy_t5gxs` OWNER TO `0a330eb5-dd51-4d97-b6e4-c474356b1d5d`
13:22 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.migration_status] fetching migration_status inventory
13:22 WARNING [databricks.labs.ucx.hive_metastore.table_migration_status] Catalog ucx_fw8sfavbuwph5ql0 no longer exists. Skipping checking its migration status.
13:22 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.tables] fetching tables inventory
13:22 INFO [databricks.labs.ucx.hive_metastore.mapping] The intended target for hive_metastore.dummy_szbtt.dummy_t5gxs, dummy_chauy.dummy_szbtt.dummy_t5gxs, already exists.
13:22 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.migration_status] ignoring any existing migration_status inventory; refresh is forced.
13:22 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.migration_status] crawling new set of snapshot data for migration_status
13:22 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.tables] fetching tables inventory
13:22 WARNING [databricks.labs.ucx.hive_metastore.table_migration_status] Catalog ucx_pjouuayssckkdaco no longer exists. Skipping checking its migration status.
13:22 WARNING [databricks.labs.ucx.hive_metastore.table_migration_status] Catalog ucx_tqh7hggpniarplcj no longer exists. Skipping checking its migration status.
13:22 INFO [databricks.labs.ucx.hive_metastore.table_migration_status] dummy_szbtt.dummy_t5gxs is set as migrated
13:22 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.migration_status] found 4 new records for migration_status
13:22 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.migration_status] fetching migration_status inventory
13:22 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.migration_status] fetching migration_status inventory
13:22 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.migration_status] fetching migration_status inventory
13:22 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.migration_status] fetching migration_status inventory
13:22 DEBUG [databricks.labs.ucx.hive_metastore.table_migrate] Migrating view hive_metastore.dummy_szbtt.view3 to using SQL query: CREATE VIEW IF NOT EXISTS `dummy_chauy`.`dummy_szbtt`.`view3` (`col1`, `col2`) WITH SCHEMA COMPENSATION TBLPROPERTIES ('transient_lastDdlTime'='1731071840') AS SELECT * FROM `dummy_chauy`.`dummy_szbtt`.`dummy_t5gxs`
13:22 DEBUG [databricks.labs.ucx.hive_metastore.table_migrate] Migrating view hive_metastore.dummy_szbtt.dummy_tnsdt to using SQL query: CREATE VIEW IF NOT EXISTS `dummy_chauy`.`dummy_szbtt`.`dummy_tnsdt` (`id`, `value`) WITH SCHEMA COMPENSATION TBLPROPERTIES ('RemoveAfter'='2024110815', 'transient_lastDdlTime'='1731071838') AS SELECT * FROM `dummy_chauy`.`dummy_szbtt`.`dummy_t5gxs`
13:22 WARNING [databricks.labs.ucx.hive_metastore.table_migrate] Failed to migrate view hive_metastore.dummy_szbtt.view3 to dummy_chauy.dummy_szbtt.view3: [UC_COMMAND_NOT_SUPPORTED.WITH_RECOMMENDATION] The command(s): 3-layer (catalogName.schemaName.tableName) table name notation: dummy_chauy.dummy_szbtt.dummy_t5gxs for view creation in HMS federation are not supported in Unity Catalog. Please use schemaName.tableName or tableName instead. SQLSTATE: 0AKUC
13:22 WARNING [databricks.labs.ucx.hive_metastore.table_migrate] Failed to migrate view hive_metastore.dummy_szbtt.dummy_tnsdt to dummy_chauy.dummy_szbtt.dummy_tnsdt: [UC_COMMAND_NOT_SUPPORTED.WITH_RECOMMENDATION] The command(s): 3-layer (catalogName.schemaName.tableName) table name notation: dummy_chauy.dummy_szbtt.dummy_t5gxs for view creation in HMS federation are not supported in Unity Catalog. Please use schemaName.tableName or tableName instead. SQLSTATE: 0AKUC
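The two failures above occur because the generated view bodies reference tables with 3-layer `catalog.schema.table` names, which the error message says Unity Catalog rejects for view creation in HMS federation. A hypothetical workaround (not a ucx function) is to strip the catalog qualifier from references in the view body when it matches the target catalog, leaving the accepted `schema.table` form:

```python
import re


def rewrite_view_body(create_sql: str, target_catalog: str) -> str:
    """Hypothetical helper: drop the catalog qualifier from 3-layer table
    names in the view body (after the first " AS "), keeping the view's own
    fully qualified name in the CREATE VIEW head intact."""
    head, sep, body = create_sql.partition(" AS ")
    pattern = re.compile(rf"`{re.escape(target_catalog)}`\.(`[^`]+`\.`[^`]+`)")
    return head + sep + pattern.sub(r"\1", body)


# Simplified version of the failing statement from the log above.
sql = (
    "CREATE VIEW IF NOT EXISTS `dummy_chauy`.`dummy_szbtt`.`view3` AS "
    "SELECT * FROM `dummy_chauy`.`dummy_szbtt`.`dummy_t5gxs`"
)
print(rewrite_view_body(sql, "dummy_chauy"))
# The FROM clause becomes `dummy_szbtt`.`dummy_t5gxs`
```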
13:22 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.migration_status] ignoring any existing migration_status inventory; refresh is forced.
13:22 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.migration_status] crawling new set of snapshot data for migration_status
13:22 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.tables] fetching tables inventory
13:23 INFO [databricks.labs.ucx.hive_metastore.table_migration_status] dummy_szbtt.dummy_t5gxs is set as migrated
13:23 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.migration_status] found 4 new records for migration_status
13:23 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.migration_status] fetching migration_status inventory
13:23 INFO [databricks.labs.ucx.hive_metastore.table_migrate] View hive_metastore.dummy_szbtt.dummy_t8t7e cannot be migrated because hive_metastore.dummy_szbtt.dummy_tnsdt is not migrated yet
13:23 INFO [databricks.labs.ucx.hive_metastore.table_migrate] View hive_metastore.dummy_szbtt.dummy_t8t7e is not supported for migration
13:23 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.migration_status] ignoring any existing migration_status inventory; refresh is forced.
13:23 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.migration_status] crawling new set of snapshot data for migration_status
13:23 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.tables] fetching tables inventory
13:23 WARNING [databricks.labs.ucx.hive_metastore.table_migration_status] Catalog ucx_f8fo1dhvbv42i5wj no longer exists. Skipping checking its migration status.
13:23 WARNING [databricks.labs.ucx.hive_metastore.table_migration_status] Catalog ucx_spaxsnhwacdqhz7z no longer exists. Skipping checking its migration status.
13:23 INFO [databricks.labs.ucx.hive_metastore.table_migration_status] dummy_szbtt.dummy_t5gxs is set as migrated
13:23 DEBUG [databricks.labs.ucx.framework.crawlers] [hive_metastore.dummy_sv3c6.migration_status] found 4 new records for migration_status
[gw9] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python
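The "cannot be migrated because ... is not migrated yet" message near the end of this log reflects dependency-ordered view migration: a view is deferred until everything it references has been migrated, and views whose dependencies never migrate stay pending. A minimal sketch of that batching logic, with invented names (this is not the ucx sequencer):

```python
def migration_batches(views: dict[str, set[str]], migrated: set[str]):
    """views maps a view name to the object names it references; yields
    batches of views whose dependencies are all migrated. Views whose
    dependencies never migrate are left pending, as in the log above."""
    pending = dict(views)
    while pending:
        ready = [v for v, deps in pending.items() if deps <= migrated]
        if not ready:
            # Remaining views depend on unmigrated (or unsupported) objects.
            break
        for v in ready:
            migrated.add(v)
            del pending[v]
        yield ready


# view3 depends on a migrated table; t8t7e depends on tnsdt, which never
# migrates, so it stays pending.
batches = list(migration_batches(
    {"view3": {"t5gxs"}, "t8t7e": {"tnsdt"}},
    migrated={"t5gxs"},
))
print(batches)  # → [['view3']]
```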
❌ test_migration_job_ext_hms[regular]: AssertionError: dummy_topjf not found in dummy_canv5.migrate_bsxzb (25m37.359s)
... (skipped 1838 bytes)
x.installer.workflows] Creating new job configuration for step=migrate-external-tables-ctas
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups-experimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-hiveserde-tables-in-place-experimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migration-progress-experimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-data-reconciliation
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables-in-mounts-experimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=scan-tables-in-mounts-experimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups
13:23 INFO [databricks.labs.ucx.install] Creating dashboards...
13:23 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/views...
13:23 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment...
13:23 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration...
13:23 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress...
13:23 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive...
13:23 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates...
13:23 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main...
13:23 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV...
13:23 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups...
13:23 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main...
13:23 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main...
13:23 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:23 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:23 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:23 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:23 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:23 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:23 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:23 INFO [databricks.labs.ucx.install] Installation completed successfully! Please refer to the https://DATABRICKS_HOST/#workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.DZAy/README for the next steps.
13:23 DEBUG [databricks.labs.ucx.installer.workflows] starting migrate-tables job: https://DATABRICKS_HOST#job/211221110553184
13:23 INFO [databricks.labs.ucx.installer.workflows] Started migrate-tables job: https://DATABRICKS_HOST#job/211221110553184/runs/172501591397302
13:23 DEBUG [databricks.labs.ucx.installer.workflows] Waiting for completion of migrate-tables job: https://DATABRICKS_HOST#job/211221110553184/runs/172501591397302
13:28 INFO [databricks.labs.ucx.installer.workflows] Completed migrate-tables job run 172501591397302 with state: RunResultState.SUCCESS
13:28 INFO [databricks.labs.ucx.installer.workflows] Completed migrate-tables job run 172501591397302 duration: 0:05:29.759000 (2024-11-08 13:23:17.605000+00:00 thru 2024-11-08 13:28:47.364000+00:00)
13:28 DEBUG [databricks.labs.ucx.installer.workflows] Validating migrate-tables workflow: https://DATABRICKS_HOST#job/211221110553184
13:13 DEBUG [databricks.labs.ucx.install] Cannot find previous installation: Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.DZAy/config.yml) doesn't exist.
13:13 INFO [databricks.labs.ucx.install] Please answer a couple of questions to configure Unity Catalog migration
13:13 INFO [databricks.labs.ucx.installer.hms_lineage] HMS Lineage feature creates one system table named system.hms_to_uc_migration.table_access and helps in your migration process from HMS to UC by allowing you to programmatically query HMS lineage data.
13:13 INFO [databricks.labs.ucx.install] Fetching installations...
13:13 WARNING [databricks.labs.ucx.install] Existing installation at /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.DZAy is corrupted. Skipping...
13:13 INFO [databricks.labs.ucx.installer.policy] Setting up an external metastore
13:13 INFO [databricks.labs.ucx.installer.policy] Creating UCX cluster policy.
13:13 DEBUG [tests.integration.conftest] Waiting for clusters to start...
13:22 DEBUG [tests.integration.conftest] Waiting for clusters to start...
13:22 INFO [databricks.labs.ucx.install] Installing UCX v0.48.1+1920241108132210
13:22 INFO [databricks.labs.ucx.install] Creating ucx schemas...
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=failing
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=assessment
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=validate-groups-permissions
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=remove-workspace-local-backup-groups
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-tables-ctas
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups-experimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-hiveserde-tables-in-place-experimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migration-progress-experimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-data-reconciliation
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables-in-mounts-experimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=scan-tables-in-mounts-experimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups
13:28 INFO [databricks.labs.ucx.install] Deleting UCX v0.48.1+1920241108132210 from https://DATABRICKS_HOST
13:28 INFO [databricks.labs.ucx.install] Deleting inventory database dummy_s3fv7
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=801312191051484, as it is no longer needed
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=117113033287939, as it is no longer needed
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=959167858909248, as it is no longer needed
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=1074375249892442, as it is no longer needed
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=416174638593876, as it is no longer needed
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=187128995947238, as it is no longer needed
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=211221110553184, as it is no longer needed
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=544437049992091, as it is no longer needed
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=317066122674191, as it is no longer needed
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=911276068015018, as it is no longer needed
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=381715240125396, as it is no longer needed
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=307218230120422, as it is no longer needed
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=796384119564505, as it is no longer needed
13:29 INFO [databricks.labs.ucx.install] Deleting cluster policy
13:29 INFO [databricks.labs.ucx.install] Deleting secret scope
13:29 INFO [databricks.labs.ucx.install] UnInstalling UCX complete
[gw3] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python
❌ test_table_migration_job_refreshes_migration_status[hiveserde-migrate-external-hiveserde-tables-in-place-experimental]: AssertionError: No migration statuses found (7m42.207s)
... (skipped 2327 bytes)
atabricks.labs.ucx.installer.workflows] Creating new job configuration for step=validate-groups-permissions
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables-in-mounts-experimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups-experimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=scan-tables-in-mounts-experimental
13:22 INFO [databricks.labs.ucx.install] Creating dashboards...
13:22 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/views...
13:22 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment...
13:22 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration...
13:22 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main...
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.install] Installation completed successfully! Please refer to the https://DATABRICKS_HOST/#workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.bvrZ/README for the next steps.
13:22 DEBUG [databricks.labs.ucx.installer.workflows] starting migrate-external-hiveserde-tables-in-place-experimental job: https://DATABRICKS_HOST#job/222596905065116
13:22 INFO [databricks.labs.ucx.installer.workflows] Started migrate-external-hiveserde-tables-in-place-experimental job: https://DATABRICKS_HOST#job/222596905065116/runs/792524115864136
13:22 DEBUG [databricks.labs.ucx.installer.workflows] Waiting for completion of migrate-external-hiveserde-tables-in-place-experimental job: https://DATABRICKS_HOST#job/222596905065116/runs/792524115864136
13:29 INFO [databricks.labs.ucx.installer.workflows] Completed migrate-external-hiveserde-tables-in-place-experimental job run 792524115864136 with state: RunResultState.SUCCESS_WITH_FAILURES (The job run succeeded with 3 failed tasks)
13:29 INFO [databricks.labs.ucx.installer.workflows] Completed migrate-external-hiveserde-tables-in-place-experimental job run 792524115864136 duration: 0:06:37.437000 (2024-11-08 13:22:44.544000+00:00 thru 2024-11-08 13:29:21.981000+00:00)
13:29 WARNING [databricks.labs.ucx.installer.workflows] Cannot fetch logs as folder /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.bvrZ/logs/migrate-external-hiveserde-tables-in-place-experimental does not exist
13:22 DEBUG [databricks.labs.ucx.install] Cannot find previous installation: Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.bvrZ/config.yml) doesn't exist.
13:22 INFO [databricks.labs.ucx.install] Please answer a couple of questions to configure Unity Catalog migration
13:22 INFO [databricks.labs.ucx.installer.hms_lineage] HMS Lineage feature creates one system table named system.hms_to_uc_migration.table_access and helps in your migration process from HMS to UC by allowing you to programmatically query HMS lineage data.
13:22 INFO [databricks.labs.ucx.install] Fetching installations...
13:22 WARNING [databricks.labs.ucx.install] Existing installation at /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.bvrZ is corrupted. Skipping...
13:22 INFO [databricks.labs.ucx.installer.policy] Creating UCX cluster policy.
13:22 DEBUG [tests.integration.conftest] Waiting for clusters to start...
13:22 DEBUG [tests.integration.conftest] Waiting for clusters to start...
13:22 INFO [databricks.labs.ucx.install] Installing UCX v0.48.1+1920241108132231
13:22 INFO [databricks.labs.ucx.install] Creating ucx schemas...
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=remove-workspace-local-backup-groups
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-hiveserde-tables-in-place-experimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-data-reconciliation
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migration-progress-experimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=failing
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-tables-ctas
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=assessment
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=validate-groups-permissions
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables-in-mounts-experimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups-experimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=scan-tables-in-mounts-experimental
13:22 INFO [databricks.labs.ucx.install] Creating dashboards...
13:22 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/views...
13:22 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment...
13:22 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration...
13:22 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main...
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.install] Installation completed successfully! Please refer to the https://DATABRICKS_HOST/#workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.bvrZ/README for the next steps.
13:22 DEBUG [databricks.labs.ucx.installer.workflows] starting migrate-external-hiveserde-tables-in-place-experimental job: https://DATABRICKS_HOST#job/222596905065116
13:22 INFO [databricks.labs.ucx.installer.workflows] Started migrate-external-hiveserde-tables-in-place-experimental job: https://DATABRICKS_HOST#job/222596905065116/runs/792524115864136
13:22 DEBUG [databricks.labs.ucx.installer.workflows] Waiting for completion of migrate-external-hiveserde-tables-in-place-experimental job: https://DATABRICKS_HOST#job/222596905065116/runs/792524115864136
13:29 INFO [databricks.labs.ucx.installer.workflows] Completed migrate-external-hiveserde-tables-in-place-experimental job run 792524115864136 with state: RunResultState.SUCCESS_WITH_FAILURES (The job run succeeded with 3 failed tasks)
13:29 INFO [databricks.labs.ucx.installer.workflows] Completed migrate-external-hiveserde-tables-in-place-experimental job run 792524115864136 duration: 0:06:37.437000 (2024-11-08 13:22:44.544000+00:00 thru 2024-11-08 13:29:21.981000+00:00)
13:29 WARNING [databricks.labs.ucx.installer.workflows] Cannot fetch logs as folder /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.bvrZ/logs/migrate-external-hiveserde-tables-in-place-experimental does not exist
13:29 INFO [databricks.labs.ucx.install] Deleting UCX v0.48.1+1920241108132231 from https://DATABRICKS_HOST
13:29 INFO [databricks.labs.ucx.install] Deleting inventory database dummy_sa6em
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=682123374170138, as it is no longer needed
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=1054730076405214, as it is no longer needed
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=222596905065116, as it is no longer needed
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=407548670415262, as it is no longer needed
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=772691803816377, as it is no longer needed
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=49903497300569, as it is no longer needed
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=151680679450556, as it is no longer needed
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=445016114881582, as it is no longer needed
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=1016381850509707, as it is no longer needed
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=823206044984596, as it is no longer needed
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=530786404359244, as it is no longer needed
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=756557607953935, as it is no longer needed
13:29 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=514035700813661, as it is no longer needed
13:29 INFO [databricks.labs.ucx.install] Deleting cluster policy
13:29 INFO [databricks.labs.ucx.install] Deleting secret scope
13:29 INFO [databricks.labs.ucx.install] UnInstalling UCX complete
[gw4] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python
❌ test_table_migration_job_refreshes_migration_status[regular-migrate-tables]: AssertionError: No destination schema found for TableType.VIEW hive_metastore.migrate_imgif.dummy_tw1qg (23m12.348s)
... (skipped 2533 bytes)
rimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=assessment
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-hiveserde-tables-in-place-experimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-data-reconciliation
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-tables-ctas
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migration-progress-experimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables-in-mounts-experimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=failing
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups-experimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=validate-groups-permissions
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=remove-workspace-local-backup-groups
13:22 INFO [databricks.labs.ucx.install] Creating dashboards...
13:22 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/views...
13:22 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment...
13:22 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration...
13:22 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main...
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.install] Installation completed successfully! Please refer to the https://DATABRICKS_HOST/#workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.d7vC/README for the next steps.
13:22 DEBUG [databricks.labs.ucx.installer.workflows] starting migrate-tables job: https://DATABRICKS_HOST#job/35501409354368
13:22 INFO [databricks.labs.ucx.installer.workflows] Started migrate-tables job: https://DATABRICKS_HOST#job/35501409354368/runs/98465796824152
13:22 DEBUG [databricks.labs.ucx.installer.workflows] Waiting for completion of migrate-tables job: https://DATABRICKS_HOST#job/35501409354368/runs/98465796824152
13:35 INFO [databricks.labs.ucx.installer.workflows] Completed migrate-tables job run 98465796824152 with state: RunResultState.SUCCESS
13:35 INFO [databricks.labs.ucx.installer.workflows] Completed migrate-tables job run 98465796824152 duration: 0:13:13.820000 (2024-11-08 13:22:19.999000+00:00 thru 2024-11-08 13:35:33.819000+00:00)
13:13 DEBUG [databricks.labs.ucx.install] Cannot find previous installation: Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.d7vC/config.yml) doesn't exist.
13:13 INFO [databricks.labs.ucx.install] Please answer a couple of questions to configure Unity Catalog migration
13:13 INFO [databricks.labs.ucx.installer.hms_lineage] HMS Lineage feature creates one system table named system.hms_to_uc_migration.table_access and helps in your migration process from HMS to UC by allowing you to programmatically query HMS lineage data.
13:13 INFO [databricks.labs.ucx.install] Fetching installations...
13:13 WARNING [databricks.labs.ucx.install] Existing installation at /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.d7vC is corrupted. Skipping...
13:13 INFO [databricks.labs.ucx.installer.policy] Creating UCX cluster policy.
13:13 DEBUG [tests.integration.conftest] Waiting for clusters to start...
13:22 DEBUG [tests.integration.conftest] Waiting for clusters to start...
13:22 INFO [databricks.labs.ucx.install] Installing UCX v0.48.1+1920241108132204
13:22 INFO [databricks.labs.ucx.install] Creating ucx schemas...
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=scan-tables-in-mounts-experimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=assessment
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-hiveserde-tables-in-place-experimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-data-reconciliation
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-tables-ctas
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migration-progress-experimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables-in-mounts-experimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=failing
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups-experimental
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=validate-groups-permissions
13:22 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=remove-workspace-local-backup-groups
13:22 INFO [databricks.labs.ucx.install] Creating dashboards...
13:22 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/views...
13:22 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment...
13:22 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration...
13:22 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main...
13:22 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main...
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:22 INFO [databricks.labs.ucx.install] Installation completed successfully! Please refer to the https://DATABRICKS_HOST/#workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.d7vC/README for the next steps.
13:22 DEBUG [databricks.labs.ucx.installer.workflows] starting migrate-tables job: https://DATABRICKS_HOST#job/35501409354368
13:22 INFO [databricks.labs.ucx.installer.workflows] Started migrate-tables job: https://DATABRICKS_HOST#job/35501409354368/runs/98465796824152
13:22 DEBUG [databricks.labs.ucx.installer.workflows] Waiting for completion of migrate-tables job: https://DATABRICKS_HOST#job/35501409354368/runs/98465796824152
13:35 INFO [databricks.labs.ucx.installer.workflows] Completed migrate-tables job run 98465796824152 with state: RunResultState.SUCCESS
13:35 INFO [databricks.labs.ucx.installer.workflows] Completed migrate-tables job run 98465796824152 duration: 0:13:13.820000 (2024-11-08 13:22:19.999000+00:00 thru 2024-11-08 13:35:33.819000+00:00)
13:35 INFO [databricks.labs.ucx.install] Deleting UCX v0.48.1+1920241108132204 from https://DATABRICKS_HOST
13:35 INFO [databricks.labs.ucx.install] Deleting inventory database dummy_snvd9
13:35 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=346897317277471, as it is no longer needed
13:35 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=75960285687050, as it is no longer needed
13:35 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=326213188180183, as it is no longer needed
13:35 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=35501409354368, as it is no longer needed
13:35 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=402984565013794, as it is no longer needed
13:35 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=1120185702154767, as it is no longer needed
13:35 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=203559948824342, as it is no longer needed
13:35 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=319661729089670, as it is no longer needed
13:35 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=918695726339321, as it is no longer needed
13:35 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=552615609659988, as it is no longer needed
13:35 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=485246248900308, as it is no longer needed
13:35 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=1060014146388709, as it is no longer needed
13:35 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=79541570451066, as it is no longer needed
13:35 INFO [databricks.labs.ucx.install] Deleting cluster policy
13:35 INFO [databricks.labs.ucx.install] Deleting secret scope
13:35 INFO [databricks.labs.ucx.install] UnInstalling UCX complete
[gw6] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python
❌ test_table_migration_for_managed_table[managed-migrate-tables]: AssertionError: dummy_txrga not found in dummy_cfhsk.managed_ebtwt (13m43.549s)
... (skipped 1404 bytes)
tabricks.labs.ucx.installer.workflows] Creating new job configuration for step=migration-progress-experimental
13:36 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-tables-ctas
13:36 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables
13:36 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups-experimental
13:36 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables-in-mounts-experimental
13:36 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=remove-workspace-local-backup-groups
13:36 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-data-reconciliation
13:36 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=scan-tables-in-mounts-experimental
13:36 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups
13:36 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=failing
13:36 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-hiveserde-tables-in-place-experimental
13:36 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=assessment
13:36 INFO [databricks.labs.ucx.install] Creating dashboards...
13:36 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/views...
13:36 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment...
13:36 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration...
13:36 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress...
13:36 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive...
13:36 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates...
13:36 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main...
13:36 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV...
13:36 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups...
13:36 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main...
13:36 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main...
13:36 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:36 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:36 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:36 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:36 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:36 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:36 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:36 INFO [databricks.labs.ucx.install] Installation completed successfully! Please refer to the https://DATABRICKS_HOST/#workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.xaBM/README for the next steps.
13:36 DEBUG [databricks.labs.ucx.installer.workflows] starting migrate-tables job: https://DATABRICKS_HOST#job/278779220584403
13:36 INFO [databricks.labs.ucx.installer.workflows] Started migrate-tables job: https://DATABRICKS_HOST#job/278779220584403/runs/956519261176192
13:36 DEBUG [databricks.labs.ucx.installer.workflows] Waiting for completion of migrate-tables job: https://DATABRICKS_HOST#job/278779220584403/runs/956519261176192
13:49 INFO [databricks.labs.ucx.installer.workflows] Completed migrate-tables job run 956519261176192 with state: RunResultState.SUCCESS
13:49 INFO [databricks.labs.ucx.installer.workflows] Completed migrate-tables job run 956519261176192 duration: 0:12:07.103000 (2024-11-08 13:36:53.883000+00:00 thru 2024-11-08 13:49:00.986000+00:00)
13:36 DEBUG [databricks.labs.ucx.install] Cannot find previous installation: Path (/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.xaBM/config.yml) doesn't exist.
13:36 INFO [databricks.labs.ucx.install] Please answer a couple of questions to configure Unity Catalog migration
13:36 INFO [databricks.labs.ucx.installer.hms_lineage] HMS Lineage feature creates one system table named system.hms_to_uc_migration.table_access and helps in your migration process from HMS to UC by allowing you to programmatically query HMS lineage data.
13:36 INFO [databricks.labs.ucx.install] Fetching installations...
13:36 WARNING [databricks.labs.ucx.install] Existing installation at /Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.xaBM is corrupted. Skipping...
13:36 INFO [databricks.labs.ucx.installer.policy] Creating UCX cluster policy.
13:36 DEBUG [tests.integration.conftest] Waiting for clusters to start...
13:36 DEBUG [tests.integration.conftest] Waiting for clusters to start...
13:36 INFO [databricks.labs.ucx.install] Installing UCX v0.48.1+1920241108133639
13:36 INFO [databricks.labs.ucx.install] Creating ucx schemas...
13:36 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=validate-groups-permissions
13:36 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migration-progress-experimental
13:36 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-tables-ctas
13:36 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables
13:36 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups-experimental
13:36 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-tables-in-mounts-experimental
13:36 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=remove-workspace-local-backup-groups
13:36 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-data-reconciliation
13:36 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=scan-tables-in-mounts-experimental
13:36 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-groups
13:36 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=failing
13:36 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=migrate-external-hiveserde-tables-in-place-experimental
13:36 INFO [databricks.labs.ucx.installer.workflows] Creating new job configuration for step=assessment
13:36 INFO [databricks.labs.ucx.install] Creating dashboards...
13:36 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/views...
13:36 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment...
13:36 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration...
13:36 DEBUG [databricks.labs.ucx.install] Reading step folder /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress...
13:36 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/interactive...
13:36 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/estimates...
13:36 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/main...
13:36 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/assessment/CLOUD_ENV...
13:36 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/groups...
13:36 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/migration/main...
13:36 INFO [databricks.labs.ucx.install] Creating dashboard in /home/runner/work/ucx/ucx/src/databricks/labs/ucx/queries/progress/main...
13:36 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:36 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:36 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:36 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:36 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:36 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:36 INFO [databricks.labs.ucx.installer.mixins] Fetching warehouse_id from a config
13:36 INFO [databricks.labs.ucx.install] Installation completed successfully! Please refer to the https://DATABRICKS_HOST/#workspace/Users/0a330eb5-dd51-4d97-b6e4-c474356b1d5d/.xaBM/README for the next steps.
13:36 DEBUG [databricks.labs.ucx.installer.workflows] starting migrate-tables job: https://DATABRICKS_HOST#job/278779220584403
13:36 INFO [databricks.labs.ucx.installer.workflows] Started migrate-tables job: https://DATABRICKS_HOST#job/278779220584403/runs/956519261176192
13:36 DEBUG [databricks.labs.ucx.installer.workflows] Waiting for completion of migrate-tables job: https://DATABRICKS_HOST#job/278779220584403/runs/956519261176192
13:49 INFO [databricks.labs.ucx.installer.workflows] Completed migrate-tables job run 956519261176192 with state: RunResultState.SUCCESS
13:49 INFO [databricks.labs.ucx.installer.workflows] Completed migrate-tables job run 956519261176192 duration: 0:12:07.103000 (2024-11-08 13:36:53.883000+00:00 thru 2024-11-08 13:49:00.986000+00:00)
13:49 INFO [databricks.labs.ucx.install] Deleting UCX v0.48.1+1920241108133639 from https://DATABRICKS_HOST
13:49 INFO [databricks.labs.ucx.install] Deleting inventory database dummy_spmc1
13:49 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=605744596295624, as it is no longer needed
13:49 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=941777752109675, as it is no longer needed
13:49 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=685740857369418, as it is no longer needed
13:49 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=278779220584403, as it is no longer needed
13:49 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=1052580989422769, as it is no longer needed
13:49 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=299050558069464, as it is no longer needed
13:49 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=693437314966174, as it is no longer needed
13:49 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=554982382192287, as it is no longer needed
13:49 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=387297350626687, as it is no longer needed
13:49 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=680970212559056, as it is no longer needed
13:49 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=449385012374913, as it is no longer needed
13:49 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=328825079136241, as it is no longer needed
13:49 INFO [databricks.labs.ucx.installer.workflows] Removing job_id=233833857260584, as it is no longer needed
13:49 INFO [databricks.labs.ucx.install] Deleting cluster policy
13:49 INFO [databricks.labs.ucx.install] Deleting secret scope
13:49 INFO [databricks.labs.ucx.install] UnInstalling UCX complete
[gw1] linux -- Python 3.10.15 /home/runner/work/ucx/ucx/.venv/bin/python

Flaky tests:

  • 🤪 test_create_catalog_schema_when_users_group_in_warehouse_acl (1m15.34s)
  • 🤪 test_hiveserde_table_ctas_migration_job[hiveserde] (12m35.139s)

Running from acceptance #7257

nfx added a commit that referenced this pull request Nov 8, 2024
* Added `MigrationSequencer` for jobs ([#3008](#3008)). In this commit, a `MigrationSequencer` class has been added to manage the migration sequence for various resources including jobs, job tasks, job task dependencies, job clusters, and clusters. The class builds a graph of dependencies and analyzes it to generate the migration sequence, which is returned as an iterable of `MigrationStep` objects. These objects contain information about the object type, ID, name, owner, required step IDs, and step number. The commit also includes new unit and integration tests to ensure the functionality is working correctly. The migration sequence is used in tests for assessing the sequencing feature, and it handles tasks that reference existing or non-existing clusters or job clusters, and new cluster definitions. This change is linked to issue [#1415](#1415) and supersedes issue [#2980](#2980). Additionally, the commit removes some unnecessary imports and fixtures from a test file.
* Added `phik` to known list ([#3198](#3198)). In this release, we have added `phik` to the known list in the provided JSON file. This change addresses part of issue [#1931](#1931), as outlined in the linked issues. The `phik` key has been added with an empty list as its value, consistent with the structure of other keys in the JSON file. It is important to note that no existing functionality has been altered and no new methods have been introduced in this commit. The scope of the change is confined to updating the known list in the JSON file by adding the `phik` key.
* Added `pmdarima` to known list ([#3199](#3199)). In this release, we have added support for the `pmdarima` library, an open-source Python package that brings R's `auto.arima` functionality to Python. With this commit, `pmdarima` is now in our known list of libraries, giving users access to its automatic ARIMA model selection, seasonality tests, and utilities for time series preprocessing and cross-validation. By integrating `pmdarima`, users can perform time series analysis and forecasting with greater ease and efficiency. This change partly resolves issue [#1931](#1931).
* Added `preshed` to known list ([#3220](#3220)). A new library, `preshed`, has been added to our project's known list, enhancing compatibility with code that depends on it. `preshed` is a Cython package providing hash tables that assume keys are pre-hashed machine integers, and is used in the spaCy ecosystem for fast lookups. The inclusion covers two modules, `preshed` and `preshed.about`, and partially resolves issue [#1931](#1931).
* Added `py-cpuinfo` to known list ([#3221](#3221)). In this release, we have added support for the `py-cpuinfo` library, which exposes detailed information about the CPU, such as the number of cores, current frequency, and vendor, which can be useful for performance tuning and optimization. This change partially resolves issue [#1931](#1931) and does not affect any existing functionality or add new methods to the codebase.
* Cater for empty python cells ([#3212](#3212)). In this release, we have resolved an issue where certain notebook cells in the dependency builder were causing crashes. Specifically, empty or comment-only cells were identified as the source of the problem. To address this, we have implemented a check to account for these cases, ensuring that an empty tree is stored in the `_python_trees` dictionary if the input cell does not produce a valid tree. This change helps prevent crashes in the dependency builder caused by empty or comment-only cells. Furthermore, we have added a test to verify the fix against a previously failing repository. If a cell does not produce a tree, the `_load_children_from_tree` method will not be executed for that cell, skipping the loading of any children trees. This enhancement improves the overall stability and reliability of the library by preventing crashes caused by invalid input.
* Create `TODO` issues every nightly run ([#3196](#3196)). A commit has been made to update the `acceptance` repository version in the `acceptance.yml` GitHub workflow from `acceptance/v0.4.0` to `acceptance/v0.4.2`, which affects the integration tests. The `Run nightly tests` step in the GitHub repository's workflow has also been updated to use a newer version of the `databrickslabs/sandbox/acceptance` action, from `v0.3.1` to `v0.4.2`. Software engineers should verify that the new version of the `acceptance` repository contains all necessary updates and fixes, and that the integration tests continue to function as expected. Additionally, testing the updated action is important to ensure that the nightly tests run successfully with up-to-date code and can catch potential issues.
* Fixed Integration test failure of migration_tables ([#3108](#3108)). This release includes a fix for two integration tests (`test_migrate_managed_table_to_external_table_without_conversion` and `test_migrate_managed_table_to_external_table_with_clone`) related to Hive Metastore table migration, addressing issues [#3054](#3054) and [#3055](#3055). Previously skipped due to underlying problems, these tests have now been unskipped, enhancing the migration feature's test coverage. No changes have been made to the existing functionality, as the focus is solely on including the previously skipped tests in the testing suite. The changes involve removing `@pytest.mark.skip` markers from the test functions, ensuring they run and provide a more comprehensive test coverage for the Hive Metastore migration feature. In addition, this release includes an update to DirectFsAccess integration tests, addressing issues related to the removal of DFSA collectors and ensuring proper handling of different file types, with no modifications made to other parts of the codebase.
* Replace MockInstallation with MockPathLookup for testing fixtures ([#3215](#3215)). In this release, we have updated the testing fixtures in our unit tests by replacing the `MockInstallation` class with `MockPathLookup`. Specifically, we have modified the `_load_sources` function to use `MockPathLookup` instead of `MockInstallation` for loading sources, and introduced a module-level `logger` for more precise logging within the module. Additionally, the `_load_sources` calls in the `test_notebook.py` file now pass the file path directly instead of a `SourceContainer` object. This modification allows for more flexible and straightforward testing of file-related functionality, thereby fixing issue [#3115](#3115).
* Updated sqlglot requirement from <25.29,>=25.5.0 to >=25.5.0,<25.30 ([#3224](#3224)). The open-source library `sqlglot` has been updated to version 25.29.0 with this release, incorporating several breaking changes, new features, and bug fixes. The breaking changes include transpiling `ANY` to `EXISTS`, supporting the `MEDIAN()` function, wrapping values in `NOT value IS ...`, and parsing information schema views into a single identifier. New features include support for the `JSONB_EXISTS` function in PostgreSQL, transpiling `ANY` to `EXISTS` in Spark, transpiling Snowflake's `TIMESTAMP()` function, and adding support for hexadecimal literals in Teradata. Bug fixes include handling a Move edge case in the semantic differ, adding a `NULL` filter on `ARRAY_AGG` only for columns, improving parsing of `WITH FILL ... INTERPOLATE` in Clickhouse, generating `LOG(...)` for `exp.Ln` in TSQL, and optionally parsing a Stream expression. The full changelog can be found in the pull request, which also includes a list of the commits included in this release.
* Use acceptance/v0.4.0 ([#3192](#3192)). A change has been made to the GitHub Actions workflow file for acceptance tests, updating the version of the `databrickslabs/sandbox/acceptance` runner to `acceptance/v0.4.0` and granting write permissions for the `issues` field in the `permissions` section. These updates will allow for the use of the latest version of the acceptance tests and provide the necessary permissions to interact with issues. A `TODO` comment has been added to indicate that the new version of the acceptance tests needs to be updated elsewhere in the codebase. This change will ensure that the acceptance tests are up-to-date and functioning properly.
* Warn about errors instead to avoid job task failure ([#3219](#3219)). In this change, the `refresh_report` method in `jobs.py` has been updated to log warnings instead of raising errors when certain problems are encountered during its execution. Previously, if there were any errors during the linting process, a `ManyError` exception was raised, causing the job task to fail. Now, errors are logged as warnings, allowing the job task to continue running successfully. This resolves issue [#3214](#3214) and ensures that the job task will not fail due to linting errors, allowing users to be aware of any issues that occurred during the linting process while still completing the job task successfully. The updated method checks for errors during the linting process, adds them to a list, and constructs a string of error messages if there are any. This string of error messages is then logged as a warning using the `logger.warning` function, allowing the method to continue executing and the job task to complete successfully.
* [DOC] Add dashboard section ([#3222](#3222)). In this release, we have added a new dashboard section to the project documentation, which provides visualizations of UCX's outcomes to help users better understand and manage their UCX environment. The new section includes a table listing the available dashboards, including the Azure service principals dashboard. This dashboard displays information about Azure service principals discovered by UCX in configurations from various sources such as clusters, cluster policies, job clusters, pipelines, and warehouses. Each dashboard has text widgets that offer detailed information about the contents and are designed to help users understand UCX's results and progress in a more visual and interactive way. The Azure service principals dashboard specifically offers users valuable insights into their Azure service principals within the UCX environment.
* [DOC] README.md rewrite ([#3211](#3211)). The Databricks Labs UCX package offers a suite of tools for migrating data objects from the Hive metastore to Unity Catalog (UC), encompassing a comprehensive table migration process. This process consists of table mapping, data access setup, creating new UC resources, and migrating Hive metastore data objects. Table mapping is achieved using a table mapping file that defaults to mapping all tables/views to UC tables while preserving the original schema and names, but can be customized as needed. Data access setup involves creating and modifying cloud principals and credentials for UC data. New UC resources are created without affecting existing Hive metastore resources, and users can choose from various strategies for migrating tables based on their format and location. Additionally, the package provides installation resources, including a README notebook, a DEBUG notebook, debug logs, and installation configuration, as well as utility commands for viewing and repairing workflows. The migration process also includes an assessment workflow, group migration workflow, data reconciliation, and code migration commands.
* [chore] Added tests to verify linter not being stuck in the infinite loop ([#3225](#3225)). In this release, we have added new functional tests to ensure that the linter does not get stuck in an infinite loop, addressing a bug fixed in version 0.46.0 that involved the default table format change from Parquet to Delta in Databricks Runtime 8.0 and a SQL parse error. These tests create data frames, write them to tables, and read from those tables, using PySpark's SQL functions and a system information schema table to demonstrate the corrected behavior. The tests also include SQL queries that select columns from a system information schema table with a specified limit, and use the `withColumn()` method to add a new column to a data frame based on a condition. These new tests provide assurance that the linter will not get stuck in an infinite loop and that SQL queries with table parameters are supported.
* [internal] Temporarily disable integration tests due to ES-1302145 ([#3226](#3226)). In this release, the integration tests for moving tables, views, and aliasing tables have been temporarily disabled due to issue ES-1302145. The `test_move_tables`, `test_move_views`, and `test_alias_tables` functions were previously decorated with `@retried` to handle potential `NotFound` exceptions and had a timeout of 2 minutes, but are now marked with `@pytest.mark.skip("ES-1302145")`. Once the issue is resolved, the `@pytest.mark.skip` decorator should be removed to re-enable the tests. The remaining code in the file, including the `test_move_tables_no_from_schema`, `test_move_tables_no_to_schema`, and `test_move_views_no_from_schema` functions, is unchanged and still functional.
* use a path instance for MISSING_SOURCE_PATH and add test ([#3217](#3217)). In this release, the handling of MISSING_SOURCE_PATH has been improved by replacing the string representation with a Path instance using Pathlib, which simplifies checks for missing source paths and enables the addition of a new test for the DependencyProblem class. This test verifies the behavior of the newly introduced method, is_path_missing(), in the DependencyProblem class for determining if a given problem is caused by a missing path. Co-authored by Eric Vergnaud, these changes not only improve the handling and testing of missing paths but also contribute to enhancing the source code analysis functionality of the databricks/labs/ucx project.
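The `MigrationSequencer` entry above describes building a dependency graph over jobs, tasks, and clusters, then emitting numbered `MigrationStep` objects in dependency order. The idea can be pictured as a topological sort; the sketch below is illustrative only, and the function and field names are assumptions, not the actual UCX API:

```python
# Hypothetical sketch of the sequencing idea: migrate every dependency
# before its dependents, numbering the resulting steps.
from dataclasses import dataclass
from graphlib import TopologicalSorter


@dataclass(frozen=True)
class MigrationStep:
    step_number: int
    object_type: str
    object_id: str


def sequence(dependencies: dict[str, set[str]],
             object_types: dict[str, str]) -> list[MigrationStep]:
    """Order objects so each one's dependencies migrate first."""
    # TopologicalSorter takes {node: predecessors} and yields nodes
    # with all predecessors emitted before the node itself.
    order = TopologicalSorter(dependencies).static_order()
    return [
        MigrationStep(step_number=n, object_type=object_types[obj], object_id=obj)
        for n, obj in enumerate(order)
    ]


# A job that references a cluster: the cluster must be migrated first.
deps = {"job-1": {"cluster-A"}, "cluster-A": set()}
types = {"job-1": "JOB", "cluster-A": "CLUSTER"}
steps = sequence(deps, types)
```

The real sequencer also tracks owners and required step IDs, but the ordering guarantee shown here is the core of the feature.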
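The empty-cell fix above boils down to always recording a tree for every cell, even when the cell is empty or comment-only. A minimal sketch of that guard, using the standard-library `ast` module as a stand-in for the project's real parser (the `register_cell` helper is hypothetical):

```python
# If a cell yields no statements (empty or comment-only), an empty tree
# is still stored, so later lookups in _python_trees never crash.
import ast

_python_trees: dict[str, ast.Module] = {}


def register_cell(cell_id: str, source: str) -> None:
    try:
        tree = ast.parse(source)
    except SyntaxError:
        # unparseable cell: fall back to an empty module tree
        tree = ast.Module(body=[], type_ignores=[])
    _python_trees[cell_id] = tree


register_cell("c1", "")                 # empty cell
register_cell("c2", "# only a comment") # comment-only cell
register_cell("c3", "x = 1")            # normal cell
```

With this in place, a downstream pass can simply skip child loading when `tree.body` is empty, mirroring how `_load_children_from_tree` is bypassed for such cells.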
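The warn-instead-of-raise change above collects lint errors and emits them as a single warning rather than raising `ManyError`, so the job task completes. A sketch of that pattern under assumed names (this is not the actual `refresh_report` code):

```python
# Collect errors during processing; log them as one warning at the end
# instead of raising, so the task finishes and still surfaces problems.
import logging

logger = logging.getLogger("lint")


def refresh_report(items: list[str]) -> int:
    errors: list[str] = []
    done = 0
    for item in items:
        if item.startswith("bad"):  # stand-in for a lint failure
            errors.append(f"cannot lint {item}")
            continue
        done += 1
    if errors:
        # previously: raise ManyError(errors) -- which failed the whole task
        logger.warning("Errors while linting: %s", "; ".join(errors))
    return done


processed = refresh_report(["ok-1", "bad-2", "ok-3"])
```

The trade-off is deliberate: failures become visible in the task logs without blocking the rest of the workflow.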
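The last bullet replaces a string sentinel with a `Path` instance and adds an `is_path_missing()` check on `DependencyProblem`. A minimal sketch of that shape, with the sentinel value and surrounding dataclass fields assumed rather than taken from the UCX source:

```python
# A Path sentinel compares cleanly with == and stays type-consistent
# with real source paths, unlike an ad-hoc string marker.
from dataclasses import dataclass
from pathlib import Path

MISSING_SOURCE_PATH = Path("<MISSING_SOURCE_PATH>")


@dataclass(frozen=True)
class DependencyProblem:
    message: str
    source_path: Path = MISSING_SOURCE_PATH

    def is_path_missing(self) -> bool:
        return self.source_path == MISSING_SOURCE_PATH


unknown = DependencyProblem("import not found")
known = DependencyProblem("bad import", Path("notebooks/etl.py"))
```

Callers can then branch on `problem.is_path_missing()` instead of comparing against a magic string.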

Dependency updates:

 * Updated sqlglot requirement from <25.29,>=25.5.0 to >=25.5.0,<25.30 ([#3224](#3224)).
nfx added a commit that referenced this pull request Nov 8, 2024
Successfully merging this pull request may close these issues.

Test failure: test_alias_tables Test failure: test_move_views Test failure: test_move_tables