Bmoric/update connection list with breaking #18125

Merged: 54 commits, Oct 24, 2022
Commits (54)
abaea95
add schemaChange
alovew Oct 10, 2022
f428057
merge conflict
alovew Oct 11, 2022
5a3df67
frontend tests
alovew Oct 11, 2022
6b8e1f0
tests
alovew Oct 12, 2022
5ac2cde
l
alovew Oct 12, 2022
024b3d3
fix source catalog id
alovew Oct 13, 2022
4fcc5a6
test
alovew Oct 13, 2022
b8eeec2
formatting
alovew Oct 13, 2022
ce5b574
move schema change to build backend web connection
alovew Oct 13, 2022
b14b322
check if actor catalog id is different
alovew Oct 13, 2022
9593ebc
fix
alovew Oct 13, 2022
558022c
tests and fixes
alovew Oct 14, 2022
17a9681
remove extra var
alovew Oct 14, 2022
e03cdbb
remove logging
alovew Oct 14, 2022
2686be6
continue to pass back new catalog id
alovew Oct 14, 2022
62ca0f7
api updates
alovew Oct 17, 2022
b78fd6e
fix mockdata
alovew Oct 17, 2022
7941c76
tests
alovew Oct 17, 2022
c42af6b
add schemaChange
alovew Oct 10, 2022
c444399
merge conflict
alovew Oct 11, 2022
2359d9d
frontend tests
alovew Oct 11, 2022
50770ae
tests
alovew Oct 12, 2022
5bc3e3e
l
alovew Oct 12, 2022
6d13a42
fix source catalog id
alovew Oct 13, 2022
20f9e30
test
alovew Oct 13, 2022
21eb1d0
formatting
alovew Oct 13, 2022
32613c0
move schema change to build backend web connection
alovew Oct 13, 2022
5320f73
check if actor catalog id is different
alovew Oct 13, 2022
826cfb0
fix
alovew Oct 13, 2022
1270de2
tests and fixes
alovew Oct 14, 2022
f10e413
remove extra var
alovew Oct 14, 2022
1c9a008
remove logging
alovew Oct 14, 2022
5781ce0
continue to pass back new catalog id
alovew Oct 14, 2022
5203c9f
api updates
alovew Oct 17, 2022
f81285c
fix mockdata
alovew Oct 17, 2022
6c4d620
tests
alovew Oct 17, 2022
9e1db24
tests
alovew Oct 17, 2022
e16a35d
optional of nullable
alovew Oct 17, 2022
1e9cc8a
Tmp
benmoriceau Oct 17, 2022
f38b682
For diff
benmoriceau Oct 18, 2022
8f28aa5
Add test
benmoriceau Oct 18, 2022
acf9ec1
More test
benmoriceau Oct 18, 2022
fc085cf
Fix test and add some
benmoriceau Oct 19, 2022
95a6dc0
Merge branch 'anne/connection-get-endpoint' of github.com:airbytehq/a…
benmoriceau Oct 19, 2022
cdf1065
Fix merge and test
benmoriceau Oct 19, 2022
d1683d5
Fix PMD
benmoriceau Oct 19, 2022
6662bb7
Merge branch 'master' into anne/connection-get-endpoint
benmoriceau Oct 19, 2022
f93f67d
Merge branch 'anne/connection-get-endpoint' into bmoric/update-connec…
benmoriceau Oct 19, 2022
a2b00ca
Merge branch 'master' of github.com:airbytehq/airbyte into bmoric/upd…
benmoriceau Oct 21, 2022
53937da
Fix test
benmoriceau Oct 21, 2022
8c8cd4b
Rm dead code
benmoriceau Oct 21, 2022
0a0c500
Fix pmd
benmoriceau Oct 21, 2022
f4959c8
Address PR comments
benmoriceau Oct 24, 2022
68d6825
RM unused column
benmoriceau Oct 24, 2022
3 changes: 3 additions & 0 deletions airbyte-api/src/main/openapi/config.yaml
@@ -4649,6 +4649,7 @@ components:
- destination
- status
- isSyncing
- schemaChange
properties:
connectionId:
$ref: "#/components/schemas/ConnectionId"
@@ -4674,6 +4675,8 @@
$ref: "#/components/schemas/JobStatus"
isSyncing:
type: boolean
schemaChange:
$ref: "#/components/schemas/SchemaChange"
WebBackendConnectionRead:
type: object
required:
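For reference, the SchemaChange schema referenced above is consumed later in this diff as a three-valued enum. A minimal sketch of the corresponding generated Java type follows; this is an assumption based on the enum values used in the handler code below, since the generated class itself is not part of this diff.

    // Hypothetical sketch of the generated model (not part of this PR's diff).
    public enum SchemaChange {
      NO_CHANGE,
      NON_BREAKING,
      BREAKING
    }
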
@@ -975,6 +975,22 @@ public Optional<ActorCatalogFetchEvent> getMostRecentActorCatalogFetchEventForSo
return records.stream().findFirst().map(DbConverter::buildActorCatalogFetchEvent);
}

public Map<UUID, ActorCatalogFetchEvent> getMostRecentActorCatalogFetchEventForSources(final List<UUID> sourceIds)
Comment from @cgardens (Contributor), Nov 20, 2022:
@benmoriceau what is this method supposed to do? the argument that is passed in is unused.

Reply from the PR author (@benmoriceau):

@cgardens It does a similar operation to getMostRecentActorCatalogFetchEventForSource, but for a list of sourceIds instead of a single source ID. The use of the input got lost during PR updates.

Fixed in #19668

throws IOException {

return database.query(ctx -> ctx.fetch(
"""
select actor_catalog_id, actor_id from
(select id, actor_catalog_id, actor_id, config_hash, actor_version, created_at, rank() over (partition by actor_id order by created_at desc) as creation_order_rank, modified_at
Comment from a reviewer (Contributor):
kind of nitpicky, but do we need to select all these fields or just actor_id, actor_catalog_id, & creation_order_rank?

Reply from the PR author:
Yes, most of the columns are not needed. I have removed them. Thanks for catching that.

from public.actor_catalog_fetch_event
) table_with_rank
where creation_order_rank = 1;
"""))
.stream().map(DbConverter::buildActorCatalogFetchEvent)
.collect(Collectors.toMap(record -> record.getActorId(),
record -> record));
}
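
As noted in the review thread above, this version of the method ignores its sourceIds argument; the follow-up fix landed in #19668. The sketch below shows one way the argument could be honored by filtering the aggregated result after the query runs. It is illustrative only: the getMostRecentActorCatalogFetchEventForAllSources helper is hypothetical, standing in for the query above, and java.util.Set/HashSet imports are assumed.

    // Hypothetical sketch, not the code merged in this PR (see #19668 for the actual fix).
    public Map<UUID, ActorCatalogFetchEvent> getMostRecentActorCatalogFetchEventForSources(final List<UUID> sourceIds)
        throws IOException {
      // Restrict the per-actor aggregation to the requested sources.
      final Set<UUID> requestedSourceIds = new HashSet<>(sourceIds);
      return getMostRecentActorCatalogFetchEventForAllSources() // hypothetical helper wrapping the query above
          .entrySet().stream()
          .filter(entry -> requestedSourceIds.contains(entry.getKey()))
          .collect(Collectors.toMap(Map.Entry::getKey, Map.Entry::getValue));
    }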

/**
* Stores source catalog information.
*
@@ -198,6 +198,7 @@ public static ActorCatalog buildActorCatalog(final Record record) {

public static ActorCatalogFetchEvent buildActorCatalogFetchEvent(final Record record) {
return new ActorCatalogFetchEvent()
.withActorId(record.get(ACTOR_CATALOG_FETCH_EVENT.ACTOR_ID))
.withActorCatalogId(record.get(ACTOR_CATALOG_FETCH_EVENT.ACTOR_CATALOG_ID));
}

@@ -506,17 +506,17 @@ void testGetGeographyForConnection() throws IOException {
}

@Test
void testGetMostRecentActorCatalogFetchEventForSources() throws SQLException, IOException, JsonValidationException {
void testGetMostRecentActorCatalogFetchEventForSource() throws SQLException, IOException, JsonValidationException {
for (final ActorCatalog actorCatalog : MockData.actorCatalogs()) {
configPersistence.writeConfig(ConfigSchema.ACTOR_CATALOG, actorCatalog.getId().toString(), actorCatalog);
}

OffsetDateTime now = OffsetDateTime.now();
OffsetDateTime yesterday = now.minusDays(1l);
final OffsetDateTime now = OffsetDateTime.now();
final OffsetDateTime yesterday = now.minusDays(1l);

List<ActorCatalogFetchEvent> fetchEvents = MockData.actorCatalogFetchEventsSameSource();
ActorCatalogFetchEvent fetchEvent1 = fetchEvents.get(0);
ActorCatalogFetchEvent fetchEvent2 = fetchEvents.get(1);
final List<ActorCatalogFetchEvent> fetchEvents = MockData.actorCatalogFetchEventsSameSource();
final ActorCatalogFetchEvent fetchEvent1 = fetchEvents.get(0);
final ActorCatalogFetchEvent fetchEvent2 = fetchEvents.get(1);

database.transaction(ctx -> {
insertCatalogFetchEvent(
@@ -533,13 +533,37 @@ void testGetMostRecentActorCatalogFetchEventForSources() throws SQLException, IO
return null;
});

Optional<ActorCatalogFetchEvent> result =
final Optional<ActorCatalogFetchEvent> result =
configRepository.getMostRecentActorCatalogFetchEventForSource(fetchEvent1.getActorId());

assertEquals(fetchEvent2.getActorCatalogId(), result.get().getActorCatalogId());
}

private void insertCatalogFetchEvent(DSLContext ctx, UUID sourceId, UUID catalogId, OffsetDateTime creationDate) {
@Test
void testGetMostRecentActorCatalogFetchEventForSources() throws SQLException, IOException, JsonValidationException {
for (final ActorCatalog actorCatalog : MockData.actorCatalogs()) {
configPersistence.writeConfig(ConfigSchema.ACTOR_CATALOG, actorCatalog.getId().toString(), actorCatalog);
}

database.transaction(ctx -> {
MockData.actorCatalogFetchEventsForAggregationTest().forEach(actorCatalogFetchEvent -> insertCatalogFetchEvent(
ctx,
actorCatalogFetchEvent.getActorCatalogFetchEvent().getActorId(),
actorCatalogFetchEvent.getActorCatalogFetchEvent().getActorCatalogId(),
actorCatalogFetchEvent.getCreatedAt()));

return null;
});

final Map<UUID, ActorCatalogFetchEvent> result =
configRepository.getMostRecentActorCatalogFetchEventForSources(List.of(MockData.SOURCE_ID_1,
MockData.SOURCE_ID_2));

assertEquals(MockData.ACTOR_CATALOG_ID_1, result.get(MockData.SOURCE_ID_1).getActorCatalogId());
assertEquals(MockData.ACTOR_CATALOG_ID_3, result.get(MockData.SOURCE_ID_2).getActorCatalogId());
}

private void insertCatalogFetchEvent(final DSLContext ctx, final UUID sourceId, final UUID catalogId, final OffsetDateTime creationDate) {
ctx.insertInto(ACTOR_CATALOG_FETCH_EVENT)
.columns(
ACTOR_CATALOG_FETCH_EVENT.ID,
@@ -50,17 +50,19 @@
import io.airbyte.protocol.models.SyncMode;
import java.net.URI;
import java.time.Instant;
import java.time.OffsetDateTime;
import java.util.Arrays;
import java.util.Collections;
import java.util.List;
import java.util.Map;
import java.util.TreeMap;
import java.util.UUID;
import java.util.stream.Collectors;
import lombok.Data;

public class MockData {

private static final UUID WORKSPACE_ID_1 = UUID.randomUUID();
public static final UUID WORKSPACE_ID_1 = UUID.randomUUID();
private static final UUID WORKSPACE_ID_2 = UUID.randomUUID();
private static final UUID WORKSPACE_ID_3 = UUID.randomUUID();
private static final UUID WORKSPACE_CUSTOMER_ID = UUID.randomUUID();
@@ -72,8 +74,8 @@ public class MockData {
private static final UUID DESTINATION_DEFINITION_ID_2 = UUID.randomUUID();
private static final UUID DESTINATION_DEFINITION_ID_3 = UUID.randomUUID();
private static final UUID DESTINATION_DEFINITION_ID_4 = UUID.randomUUID();
private static final UUID SOURCE_ID_1 = UUID.randomUUID();
private static final UUID SOURCE_ID_2 = UUID.randomUUID();
public static final UUID SOURCE_ID_1 = UUID.randomUUID();
public static final UUID SOURCE_ID_2 = UUID.randomUUID();
private static final UUID SOURCE_ID_3 = UUID.randomUUID();
private static final UUID DESTINATION_ID_1 = UUID.randomUUID();
private static final UUID DESTINATION_ID_2 = UUID.randomUUID();
@@ -91,11 +93,12 @@ public class MockData {
private static final UUID SOURCE_OAUTH_PARAMETER_ID_2 = UUID.randomUUID();
private static final UUID DESTINATION_OAUTH_PARAMETER_ID_1 = UUID.randomUUID();
private static final UUID DESTINATION_OAUTH_PARAMETER_ID_2 = UUID.randomUUID();
private static final UUID ACTOR_CATALOG_ID_1 = UUID.randomUUID();
public static final UUID ACTOR_CATALOG_ID_1 = UUID.randomUUID();
private static final UUID ACTOR_CATALOG_ID_2 = UUID.randomUUID();
private static final UUID ACTOR_CATALOG_ID_3 = UUID.randomUUID();
public static final UUID ACTOR_CATALOG_ID_3 = UUID.randomUUID();
private static final UUID ACTOR_CATALOG_FETCH_EVENT_ID_1 = UUID.randomUUID();
private static final UUID ACTOR_CATALOG_FETCH_EVENT_ID_2 = UUID.randomUUID();
private static final UUID ACTOR_CATALOG_FETCH_EVENT_ID_3 = UUID.randomUUID();

public static final String MOCK_SERVICE_ACCOUNT_1 = "{\n"
+ " \"type\" : \"service_account\",\n"
@@ -622,8 +625,8 @@ public static List<ActorCatalogFetchEvent> actorCatalogFetchEvents() {
.withId(ACTOR_CATALOG_FETCH_EVENT_ID_2)
.withActorCatalogId(ACTOR_CATALOG_ID_2)
.withActorId(SOURCE_ID_2)
.withConfigHash("1394")
.withConnectorVersion("1.2.0");
.withConfigHash("1395")
.withConnectorVersion("1.42.0");
return Arrays.asList(actorCatalogFetchEvent1, actorCatalogFetchEvent2);
}

@@ -643,6 +646,42 @@ public static List<ActorCatalogFetchEvent> actorCatalogFetchEventsSameSource() {
return Arrays.asList(actorCatalogFetchEvent1, actorCatalogFetchEvent2);
}

@Data
public static class ActorCatalogFetchEventWithCreationDate {

private final ActorCatalogFetchEvent actorCatalogFetchEvent;
private final OffsetDateTime createdAt;

}

public static List<ActorCatalogFetchEventWithCreationDate> actorCatalogFetchEventsForAggregationTest() {
final OffsetDateTime now = OffsetDateTime.now();
final OffsetDateTime yesterday = OffsetDateTime.now().minusDays(1l);

final ActorCatalogFetchEvent actorCatalogFetchEvent1 = new ActorCatalogFetchEvent()
.withId(ACTOR_CATALOG_FETCH_EVENT_ID_1)
.withActorCatalogId(ACTOR_CATALOG_ID_1)
.withActorId(SOURCE_ID_1)
.withConfigHash("CONFIG_HASH")
.withConnectorVersion("1.0.0");
final ActorCatalogFetchEvent actorCatalogFetchEvent2 = new ActorCatalogFetchEvent()
.withId(ACTOR_CATALOG_FETCH_EVENT_ID_2)
.withActorCatalogId(ACTOR_CATALOG_ID_2)
.withActorId(SOURCE_ID_2)
.withConfigHash("1394")
.withConnectorVersion("1.2.0");
final ActorCatalogFetchEvent actorCatalogFetchEvent3 = new ActorCatalogFetchEvent()
.withId(ACTOR_CATALOG_FETCH_EVENT_ID_3)
.withActorCatalogId(ACTOR_CATALOG_ID_3)
.withActorId(SOURCE_ID_2)
.withConfigHash("1394")
.withConnectorVersion("1.2.0");
return Arrays.asList(
new ActorCatalogFetchEventWithCreationDate(actorCatalogFetchEvent1, now),
new ActorCatalogFetchEventWithCreationDate(actorCatalogFetchEvent2, yesterday),
new ActorCatalogFetchEventWithCreationDate(actorCatalogFetchEvent3, now));
}

public static List<WorkspaceServiceAccount> workspaceServiceAccounts() {
final WorkspaceServiceAccount workspaceServiceAccount = new WorkspaceServiceAccount()
.withWorkspaceId(WORKSPACE_ID_1)
@@ -99,7 +99,8 @@ public ConnectionStateType getStateType(final ConnectionIdRequestBody connection
return Enums.convertTo(stateHandler.getState(connectionIdRequestBody).getStateType(), ConnectionStateType.class);
}

public WebBackendConnectionReadList webBackendListConnectionsForWorkspace(final WorkspaceIdRequestBody workspaceIdRequestBody) throws IOException {
public WebBackendConnectionReadList webBackendListConnectionsForWorkspace(final WorkspaceIdRequestBody workspaceIdRequestBody)
throws IOException, JsonValidationException, ConfigNotFoundException {

// passing 'false' so that deleted connections are not included
final List<StandardSync> standardSyncs =
@@ -113,6 +114,9 @@ public WebBackendConnectionReadList webBackendListConnectionsForWorkspace(final
final Map<UUID, JobRead> runningJobByConnectionId =
getRunningJobByConnectionId(standardSyncs.stream().map(StandardSync::getConnectionId).toList());

final Map<UUID, ActorCatalogFetchEvent> newestFetchEventsByActorId =
configRepository.getMostRecentActorCatalogFetchEventForSources(new ArrayList<>());
Comment from a reviewer (Contributor):
@benmoriceau why are we adding more db queries into this handler? it's really specifically not supposed to be making direct database calls.

Reply from the PR author:
@cgardens It was not very clear that the repository shouldn't be used here, especially since it was recently used in the same endpoint and other endpoints to get data from the DB. The ticket related to this PR, #17526, only needs the described functionality in the webBackend endpoint, so it made sense to add that direct call in the WebBackendHandler.
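
For illustration only (not code from this diff): instead of an empty list, the handler could pass the source IDs of the syncs being listed, along these lines.

    // Hypothetical sketch: collect the source id of every listed sync and pass it to the query.
    final Map<UUID, ActorCatalogFetchEvent> newestFetchEventsByActorId =
        configRepository.getMostRecentActorCatalogFetchEventForSources(
            standardSyncs.stream().map(StandardSync::getSourceId).toList());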

Comment from the PR author, @benmoriceau, Nov 22, 2022:
This comment is easy to miss when the same field is being used within the same function here. When you are looking at the function to update, it is not explicit that this is a deprecated field. Something like this would make it very explicit that it is deprecated: it would show the getter as an error and would have prevented any use of it.
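
A minimal sketch of the kind of annotation being suggested here; the field and getter names are assumptions for illustration, not taken from this diff.

    // Hypothetical sketch: annotating the deprecated field and its getter makes any remaining
    // usage surface as a deprecation warning (or a build error with warnings treated as errors).
    @Deprecated(forRemoval = true)
    private UUID sourceCatalogId;

    @Deprecated(forRemoval = true)
    public UUID getSourceCatalogId() {
      return sourceCatalogId;
    }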

Reply from a reviewer (Contributor):
Looks good, let's do that. I also put up a PR that adds more explanation in the javadocs: #19719.


final List<WebBackendConnectionListItem> connectionItems = Lists.newArrayList();

for (final StandardSync standardSync : standardSyncs) {
@@ -122,7 +126,8 @@ public WebBackendConnectionReadList webBackendListConnectionsForWorkspace(final
sourceReadById,
destinationReadById,
latestJobByConnectionId,
runningJobByConnectionId));
runningJobByConnectionId,
Optional.ofNullable(newestFetchEventsByActorId.get(standardSync.getSourceId()))));
}

return new WebBackendConnectionReadList().connections(connectionItems);
@@ -175,51 +180,33 @@ private WebBackendConnectionRead buildWebBackendConnectionRead(final ConnectionR
webBackendConnectionRead.setLatestSyncJobStatus(job.getStatus());
});

SchemaChange schemaChange = getSchemaChange(connectionRead, currentSourceCatalogId);
final Optional<ActorCatalogFetchEvent> mostRecentFetchEvent =
configRepository.getMostRecentActorCatalogFetchEventForSource(connectionRead.getSourceId());
Note from the PR author:
This has been extracted from getSchemaChange in order to avoid running one request per connection when listing connections.


final SchemaChange schemaChange = getSchemaChange(connectionRead, currentSourceCatalogId, mostRecentFetchEvent);

webBackendConnectionRead.setSchemaChange(schemaChange);

return webBackendConnectionRead;
}

/*
* A breakingChange boolean is stored on the connectionRead object and corresponds to the boolean
* breakingChange field on the connection table. If there is not a breaking change, we still have to
* check whether there is a non-breaking schema change by fetching the most recent
* ActorCatalogFetchEvent. A new ActorCatalogFetchEvent is stored each time there is a source schema
* refresh, so if the most recent ActorCatalogFetchEvent has a different actor catalog than the
* existing actor catalog, there is a schema change.
*/
private SchemaChange getSchemaChange(ConnectionRead connectionRead, Optional<UUID> currentSourceCatalogId) throws IOException {
SchemaChange schemaChange = SchemaChange.NO_CHANGE;

if (connectionRead.getBreakingChange()) {
schemaChange = SchemaChange.BREAKING;
} else if (currentSourceCatalogId.isPresent()) {
final Optional<ActorCatalogFetchEvent> mostRecentFetchEvent =
configRepository.getMostRecentActorCatalogFetchEventForSource(connectionRead.getSourceId());

if (mostRecentFetchEvent.isPresent()) {
if (!mostRecentFetchEvent.get().getActorCatalogId().equals(currentSourceCatalogId.get())) {
schemaChange = SchemaChange.NON_BREAKING;
}
}
}

return schemaChange;
}

private WebBackendConnectionListItem buildWebBackendConnectionListItem(
final StandardSync standardSync,
final Map<UUID, SourceRead> sourceReadById,
final Map<UUID, DestinationRead> destinationReadById,
final Map<UUID, JobRead> latestJobByConnectionId,
final Map<UUID, JobRead> runningJobByConnectionId) {
final Map<UUID, JobRead> runningJobByConnectionId,
final Optional<ActorCatalogFetchEvent> latestFetchEvent)
throws JsonValidationException, ConfigNotFoundException, IOException {

final SourceRead source = sourceReadById.get(standardSync.getSourceId());
final DestinationRead destination = destinationReadById.get(standardSync.getDestinationId());
final Optional<JobRead> latestSyncJob = Optional.ofNullable(latestJobByConnectionId.get(standardSync.getConnectionId()));
final Optional<JobRead> latestRunningSyncJob = Optional.ofNullable(runningJobByConnectionId.get(standardSync.getConnectionId()));
final ConnectionRead connectionRead = connectionsHandler.getConnection(standardSync.getConnectionId());
final Optional<UUID> currentCatalogId = connectionRead == null ? Optional.empty() : Optional.ofNullable(connectionRead.getSourceCatalogId());
Comment from a reviewer (Contributor):
Is it possible for connectionRead to be null?

Reply from the PR author:
Nothing tells me that the field is not nullable, so I preferred to handle the possibility.


final SchemaChange schemaChange = getSchemaChange(connectionRead, currentCatalogId, latestFetchEvent);

final WebBackendConnectionListItem listItem = new WebBackendConnectionListItem()
.connectionId(standardSync.getConnectionId())
@@ -230,7 +217,8 @@ private WebBackendConnectionListItem buildWebBackendConnectionListItem(
.scheduleType(ApiPojoConverters.toApiConnectionScheduleType(standardSync))
.scheduleData(ApiPojoConverters.toApiConnectionScheduleData(standardSync))
.source(source)
.destination(destination);
.destination(destination)
.schemaChange(schemaChange);

listItem.setIsSyncing(latestRunningSyncJob.isPresent());

@@ -242,6 +230,34 @@ private WebBackendConnectionListItem buildWebBackendConnectionListItem(
return listItem;
}

/*
* A breakingChange boolean is stored on the connectionRead object and corresponds to the boolean
* breakingChange field on the connection table. If there is not a breaking change, we still have to
* check whether there is a non-breaking schema change by fetching the most recent
* ActorCatalogFetchEvent. A new ActorCatalogFetchEvent is stored each time there is a source schema
* refresh, so if the most recent ActorCatalogFetchEvent has a different actor catalog than the
* existing actor catalog, there is a schema change.
*/
@VisibleForTesting
SchemaChange getSchemaChange(
Note from the PR author:
@alovew I refactored this method to make it easier to understand. I also added missing tests for it. Please make sure that I didn't miss anything in the re-implementation.

final ConnectionRead connectionRead,
final Optional<UUID> currentSourceCatalogId,
final Optional<ActorCatalogFetchEvent> mostRecentFetchEvent) {
if (connectionRead == null || currentSourceCatalogId.isEmpty()) {
return SchemaChange.NO_CHANGE;
}

if (connectionRead.getBreakingChange() != null && connectionRead.getBreakingChange()) {
return SchemaChange.BREAKING;
}

if (mostRecentFetchEvent.isPresent() && !mostRecentFetchEvent.map(ActorCatalogFetchEvent::getActorCatalogId).equals(currentSourceCatalogId)) {
return SchemaChange.NON_BREAKING;
}

return SchemaChange.NO_CHANGE;
}
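
A short usage sketch of the refactored method as it might appear inside a JUnit test; handler, the UUIDs, and the fluent ConnectionRead setter are illustrative assumptions, not code from this diff.

    // Hypothetical sketch, not part of this PR.
    final UUID currentCatalogId = UUID.randomUUID();
    final UUID newerCatalogId = UUID.randomUUID();
    final ConnectionRead connection = new ConnectionRead().breakingChange(false);

    // A most-recent fetch event pointing at a different catalog is reported as a non-breaking change.
    final ActorCatalogFetchEvent newerFetchEvent = new ActorCatalogFetchEvent().withActorCatalogId(newerCatalogId);
    assertEquals(SchemaChange.NON_BREAKING,
        handler.getSchemaChange(connection, Optional.of(currentCatalogId), Optional.of(newerFetchEvent)));

    // A missing connection or missing current catalog id short-circuits to NO_CHANGE.
    assertEquals(SchemaChange.NO_CHANGE,
        handler.getSchemaChange(null, Optional.of(currentCatalogId), Optional.empty()));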

private SourceRead getSourceRead(final UUID sourceId) throws JsonValidationException, IOException, ConfigNotFoundException {
final SourceIdRequestBody sourceIdRequestBody = new SourceIdRequestBody().sourceId(sourceId);
return sourceHandler.getSource(sourceIdRequestBody);