Project factory bff #259

Closed
wants to merge 274 commits
Conversation

@ashish-egov (Contributor) commented Apr 4, 2024

Summary by CodeRabbit

  • New Features

    • Introduced comprehensive functionalities for handling Excel files, including fetching workbook and sheet data.
    • Added capabilities for searching in MDMS, generating unique identifiers, and creating projects.
    • Enhanced resource management features, including staff and facilities management.
    • Improved data validation processes and search functionalities.
    • Implemented new functions for managing hierarchical data and relationships, particularly for boundaries.
  • Refactor

    • Optimized data processing for generating Excel sheets with conditional logic.
    • Removed unnecessary console logging.
  • Bug Fixes

    • Corrected filter validation to ensure accurate boundary and root definitions.

jagankumar-egov and others added 30 commits December 12, 2023 15:32
* Fix : Fixed duplicate caching issue in add screen

* Fix : Fixed caching issue in MDMSAddV4 screen

* feat: implemented cache deletion for specific screens
* Feat : Created Endpoints for transform data

* Feat : Created Endpoints to transform xlsx data

* Refactor : Refactored parsing logics

* Refactor: Refactored Kafka Listener

* Refactor: Refactored Kafka Listener

* Refactor: Refactored Config for helm chart

* Feat : Implemented Dynamic url

* Feat : Integrated multiple ParsingTemplates and ingestion message in kafka

* Feat : Integrated with ingestion api

* Feat : Integrated transform service for unified-dev
jagankumar-egov and others added 27 commits March 28, 2024 19:20
coderabbitai bot (Contributor) commented Apr 4, 2024

Walkthrough

The recent updates across various utility services focus on enhancing data handling and processing capabilities, particularly around Excel file manipulation, boundary and hierarchy management, and data validation. New functionalities include the generation of unique IDs, project creation, resource management, and advanced filtering for data processing. These changes aim to streamline operations, improve data accuracy, and facilitate more efficient project and campaign management.

Changes

  • .../egov-bff/src/server/api/index.ts: Introduced functions for Excel file handling, MDMS searching, unique ID generation, project creation, resource management, data validation, and more.
  • .../project-factory/src/server/api/campaignApis.ts and .../project-factory/src/server/utils/campaignUtils.ts: Added functions for hierarchy generation and fetching, including generateHierarchy, generateHierarchyList, and getHierarchy.
  • .../project-factory/src/server/api/genericApis.ts and .../project-factory/src/server/utils/genericUtils.ts: Enhanced boundary data processing with new functions and updated getBoundarySheetData for conditional logic in Excel sheet generation.
  • .../project-factory/src/server/controllers/dataManage/dataManage.controller.ts: Removed an unnecessary console log statement.
  • .../project-factory/src/server/utils/validators/campaignValidators.ts: Introduced the validateFilters function for validating filter criteria.

Poem

🐇✨
In the realm of code, where data streams flow,
A rabbit hopped in, with changes in tow.
Excel sheets dance, and hierarchies grow,
Boundaries defined, where once was unknown.
With each line of code, our world expands,
Crafted by paws, in digital lands.
🌟📊

coderabbitai bot left a comment

Actionable comments posted: 12

Review Status

Configuration used: CodeRabbit UI

Commits: files that changed from the base of the PR and between 19665ae and f89a265.
Files ignored due to path filters (1)
  • utilities/project-factory/package.json is excluded by !**/*.json
Files selected for processing (8)
  • utilities/egov-bff/src/server/api/index.ts (1 hunks)
  • utilities/egov-bff/src/server/utils/index.ts (1 hunks)
  • utilities/project-factory/src/server/api/campaignApis.ts (3 hunks)
  • utilities/project-factory/src/server/api/genericApis.ts (3 hunks)
  • utilities/project-factory/src/server/controllers/dataManage/dataManage.controller.ts (1 hunks)
  • utilities/project-factory/src/server/utils/campaignUtils.ts (3 hunks)
  • utilities/project-factory/src/server/utils/genericUtils.ts (4 hunks)
  • utilities/project-factory/src/server/utils/validators/campaignValidators.ts (2 hunks)
Files not summarized due to errors (1)
  • utilities/egov-bff/src/server/utils/index.ts: Error: Message exceeds token limit
Files skipped from review due to trivial changes (1)
  • utilities/project-factory/src/server/controllers/dataManage/dataManage.controller.ts
Additional comments not posted (9)
utilities/project-factory/src/server/api/campaignApis.ts (2)

379-396: LGTM! The generateHierarchyList function correctly generates hierarchy chains from boundary data.


9-9: Acknowledged the mention of generateHierarchy function added in campaignUtils. Ensure its implementation aligns with the intended functionality.

utilities/project-factory/src/server/api/genericApis.ts (1)

6-8: Acknowledged the addition of getBoundaryRelationshipData and generateFilteredBoundaryData. Ensure their implementations align with the intended functionality and error handling is adequately addressed.

utilities/project-factory/src/server/utils/campaignUtils.ts (1)

666-677: Ensure validateFilters function properly handles all possible edge cases and inputs.
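For illustration, a minimal, hedged sketch of the kinds of edge cases worth exercising; the request shapes below are assumptions for a quick check, not part of this PR, and validateFilters refers to the function introduced in this changeset:

const edgeCases = [
  { body: {} },                                  // Filters missing entirely
  { body: { Filters: { boundaries: [] } } },     // empty array, so no root present
  { body: { Filters: { boundaries: [             // two roots at the same boundary level
    { code: "A", boundaryType: "District", isRoot: true, includeAllChildren: true },
    { code: "B", boundaryType: "District", isRoot: true, includeAllChildren: true }
  ] } } },
];

for (const request of edgeCases) {
  try {
    validateFilters(request);                    // each case above is expected to throw
  } catch (e: any) {
    console.log("Rejected as expected: " + e.message);
  }
}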

utilities/egov-bff/src/server/api/index.ts (4)

61-82: Be cautious with logging the search result in the searchMDMS function to avoid potential data leakage. Verify the consistency of the error structure to ensure safe access to error details.


87-106: Consider removing or modifying the log statement that logs the request data in the getCampaignNumber function to avoid potential data leakage. Ensure that the httpRequest function used for making the request has adequate error handling.


176-192: Ensure that the httpRequest function used in createAndUploadFile for uploading the file has adequate error handling. Consider adding explicit error handling in this function if necessary.


194-211: LGTM! The generateHierarchyList function correctly implements recursion to generate hierarchy chains and follows best practices.

utilities/egov-bff/src/server/utils/index.ts (1)

94-99: Ensure proper validation of the cache key in the cacheResponse function to prevent potential security risks.

Consider adding a validation step to ensure that the key does not contain any sensitive information or patterns that could be exploited.
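A minimal sketch of the kind of guard being suggested; the pattern, length limit, and helper name are assumptions, not part of this PR:

const CACHE_KEY_PATTERN = /^[A-Za-z0-9:_-]{1,128}$/;

function assertValidCacheKey(key: string): void {
  // Reject keys with unexpected characters or excessive length before caching.
  if (!CACHE_KEY_PATTERN.test(key)) {
    throw new Error("Invalid cache key: only alphanumerics, ':', '_' and '-' are allowed (max 128 chars)");
  }
}

// Hypothetical usage inside cacheResponse:
//   assertValidCacheKey(key);
//   cache.set(key, response);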

Comment on lines +681 to +685
async function getBoundaryRelationshipData(request: any, params: any) {
const url = `${config.host.boundaryHost}${config.paths.boundaryRelationship}`;
const boundaryRelationshipResponse = await httpRequest(url, request.body, params);
return boundaryRelationshipResponse?.TenantBoundary?.[0]?.boundary;
}

Consider adding error handling for the external API call in getBoundaryRelationshipData.

  const boundaryRelationshipResponse = await httpRequest(url, request.body, params).catch(e => {
    logger.error(`Failed to fetch boundary relationship data: ${e.message}`);
    throw new Error(`Failed to fetch boundary relationship data: ${e.message}`);
  });

Committable suggestion:

async function getBoundaryRelationshipData(request: any, params: any) {
  const url = `${config.host.boundaryHost}${config.paths.boundaryRelationship}`;
  const boundaryRelationshipResponse = await httpRequest(url, request.body, params).catch(e => {
    logger.error(`Failed to fetch boundary relationship data: ${e.message}`);
    throw new Error(`Failed to fetch boundary relationship data: ${e.message}`);
  });
  return boundaryRelationshipResponse?.TenantBoundary?.[0]?.boundary;
}

Comment on lines +687 to +721
async function getDataSheetReady(boundaryData: any, request: any) {
if (!boundaryData) {
throw new Error("No boundary data provided.");
}
const boundaryType = boundaryData?.[0].boundaryType;
const boundaryList = generateHierarchyList(boundaryData)
if (!Array.isArray(boundaryList) || boundaryList.length === 0) {
throw new Error("Boundary list is empty or not an array.");
}
const boundaryCodes = boundaryList.map(boundary => boundary.split(',').pop());
const string = boundaryCodes.join(', ');
const boundaryEntityResponse = await httpRequest(config.host.boundaryHost + config.paths.boundaryServiceSearch, request.body, { tenantId: "pg", codes: string });

const boundaryCodeNameMapping: { [key: string]: string } = {};
boundaryEntityResponse?.Boundary?.forEach((data: any) => {
boundaryCodeNameMapping[data?.code] = data?.additionalDetails?.name;
});

const hierarchy = await getHierarchy(request, request?.query?.tenantId, request?.query?.hierarchyType);
const startIndex = boundaryType ? hierarchy.indexOf(boundaryType) : -1;
const reducedHierarchy = startIndex !== -1 ? hierarchy.slice(startIndex) : hierarchy;
const headers = [...reducedHierarchy, "Boundary Code", "Target at the Selected Boundary level", "Start Date of Campaign (Optional Field)", "End Date of Campaign (Optional Field)"];
const data = boundaryList.map(boundary => {
const boundaryParts = boundary.split(',');
const boundaryCode = boundaryParts[boundaryParts.length - 1];
const rowData = boundaryParts.concat(Array(Math.max(0, reducedHierarchy.length - boundaryParts.length)).fill(''));
const mappedRowData = rowData.map((cell: any, index: number) =>
index === reducedHierarchy.length ? '' : cell !== '' ? boundaryCodeNameMapping[cell] || cell : ''
);
const boundaryCodeIndex = reducedHierarchy.length;
mappedRowData[boundaryCodeIndex] = boundaryCode;
return mappedRowData;
});
return await createExcelSheet(data, headers);
}

Refactor getDataSheetReady to improve error handling and readability.

-  if (!boundaryData) {
-    throw new Error("No boundary data provided.");
-  }
+  if (!Array.isArray(boundaryData) || boundaryData.length === 0) {
+    throw new Error("Boundary data is empty or not provided.");
+  }

Also, consider validating boundaryType and hierarchy to ensure they are not undefined before using them.


Committable suggestion:

async function getDataSheetReady(boundaryData: any, request: any) {
  if (!Array.isArray(boundaryData) || boundaryData.length === 0) {
    throw new Error("Boundary data is empty or not provided.");
  }
  const boundaryType = boundaryData?.[0].boundaryType;
  const boundaryList = generateHierarchyList(boundaryData)
  if (!Array.isArray(boundaryList) || boundaryList.length === 0) {
    throw new Error("Boundary list is empty or not an array.");
  }
  const boundaryCodes = boundaryList.map(boundary => boundary.split(',').pop());
  const string = boundaryCodes.join(', ');
  const boundaryEntityResponse = await httpRequest(config.host.boundaryHost + config.paths.boundaryServiceSearch, request.body, { tenantId: "pg", codes: string });
  const boundaryCodeNameMapping: { [key: string]: string } = {};
  boundaryEntityResponse?.Boundary?.forEach((data: any) => {
    boundaryCodeNameMapping[data?.code] = data?.additionalDetails?.name;
  });
  const hierarchy = await getHierarchy(request, request?.query?.tenantId, request?.query?.hierarchyType);
  const startIndex = boundaryType ? hierarchy.indexOf(boundaryType) : -1;
  const reducedHierarchy = startIndex !== -1 ? hierarchy.slice(startIndex) : hierarchy;
  const headers = [...reducedHierarchy, "Boundary Code", "Target at the Selected Boundary level", "Start Date of Campaign (Optional Field)", "End Date of Campaign (Optional Field)"];
  const data = boundaryList.map(boundary => {
    const boundaryParts = boundary.split(',');
    const boundaryCode = boundaryParts[boundaryParts.length - 1];
    const rowData = boundaryParts.concat(Array(Math.max(0, reducedHierarchy.length - boundaryParts.length)).fill(''));
    const mappedRowData = rowData.map((cell: any, index: number) =>
      index === reducedHierarchy.length ? '' : cell !== '' ? boundaryCodeNameMapping[cell] || cell : ''
    );
    const boundaryCodeIndex = reducedHierarchy.length;
    mappedRowData[boundaryCodeIndex] = boundaryCode;
    return mappedRowData;
  });
  return await createExcelSheet(data, headers);
}

Comment on lines +39 to +50
const throwError = (
message = "Internal Server Error",
code = "INTERNAL_SERVER_ERROR",
status = 500
) => {
let error = new Error(message);
// error.status = status;
// error.code = code;
logger.error("Error : " + error);

throw error;
};

Consider adding error code and status to the thrown error in the throwError function.

  let error = new Error(message);
+ error.status = status;
+ error.code = code;

This change ensures that the error object contains all relevant information, making it easier to handle downstream.


Committable suggestion:

const throwError = (
  message = "Internal Server Error",
  code = "INTERNAL_SERVER_ERROR",
  status = 500
) => {
  let error = new Error(message);
  error.status = status;
  error.code = code;
  logger.error("Error : " + error);
  throw error;
};
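Note that Error has no status or code properties in TypeScript, so the assignment above would need a cast under strict settings. One hedged alternative is a small custom error class; this is a sketch only, AppError is not part of this PR, and logger is assumed to be the file's existing logger:

class AppError extends Error {
  constructor(
    message = "Internal Server Error",
    public code = "INTERNAL_SERVER_ERROR",
    public status = 500
  ) {
    super(message);
    this.name = "AppError";
  }
}

const throwError = (message?: string, code?: string, status?: number) => {
  const error = new AppError(message, code, status);
  logger.error("Error : " + error);
  throw error;
};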

Comment on lines +188 to +215
async function generateXlsxFromJson(request: any, response: any, simplifiedData: any) {
try {
const ws = XLSX.utils.json_to_sheet(simplifiedData);

// Create a new workbook
const wb = XLSX.utils.book_new();
XLSX.utils.book_append_sheet(wb, ws, 'Sheet 1');
const buffer = XLSX.write(wb, { bookType: 'xlsx', type: 'buffer' });
const formData = new FormData();
formData.append('file', buffer, 'filename.xlsx');
formData.append('tenantId', request?.body?.RequestInfo?.userInfo?.tenantId);
formData.append('module', 'pgr');

logger.info("File uploading url : " + config.host.filestore + config.paths.filestore);
var fileCreationResult = await httpRequest(config.host.filestore + config.paths.filestore, formData, undefined, undefined, undefined,
{
'Content-Type': 'multipart/form-data',
'auth-token': request?.body?.RequestInfo?.authToken
}
);
const responseData = fileCreationResult?.files;
logger.info("Response data after File Creation : " + JSON.stringify(responseData));
return responseData;
} catch (e: any) {
const errorMessage = "Error occurred while fetching the file store ID: " + e.message;
logger.error(errorMessage)
return errorResponder({ message: errorMessage + " Check Logs" }, request, response);
}

Add error handling for potential failures in external calls within the generateXlsxFromJson function.

  } catch (e: any) {
+   logger.error("Failed to generate XLSX from JSON: ", e);
    const errorMessage = "Error occurred while fetching the file store ID: " + e.message;
    logger.error(errorMessage)
    return errorResponder({ message: errorMessage + "    Check Logs" }, request, response);
  }

This ensures that any errors encountered during the process are logged, providing more context for debugging.


Committable suggestion:

async function generateXlsxFromJson(request: any, response: any, simplifiedData: any) {
  try {
    const ws = XLSX.utils.json_to_sheet(simplifiedData);
    // Create a new workbook
    const wb = XLSX.utils.book_new();
    XLSX.utils.book_append_sheet(wb, ws, 'Sheet 1');
    const buffer = XLSX.write(wb, { bookType: 'xlsx', type: 'buffer' });
    const formData = new FormData();
    formData.append('file', buffer, 'filename.xlsx');
    formData.append('tenantId', request?.body?.RequestInfo?.userInfo?.tenantId);
    formData.append('module', 'pgr');
    logger.info("File uploading url : " + config.host.filestore + config.paths.filestore);
    var fileCreationResult = await httpRequest(config.host.filestore + config.paths.filestore, formData, undefined, undefined, undefined,
      {
        'Content-Type': 'multipart/form-data',
        'auth-token': request?.body?.RequestInfo?.authToken
      }
    );
    const responseData = fileCreationResult?.files;
    logger.info("Response data after File Creation : " + JSON.stringify(responseData));
    return responseData;
  } catch (e: any) {
    logger.error("Failed to generate XLSX from JSON: ", e);
    const errorMessage = "Error occurred while fetching the file store ID: " + e.message;
    logger.error(errorMessage)
    return errorResponder({ message: errorMessage + " Check Logs" }, request, response);
  }
}

Comment on lines +361 to +381
function validateFilters(request: any) {
const boundaries = request?.body?.Filters?.boundaries;

if (!Array.isArray(boundaries)) {
throw new Error("Boundaries should be an array");
}

const rootBoundaries = boundaries.filter((boundary: any) => boundary.isRoot);

if (rootBoundaries.length !== 1) {
throw new Error("Invalid Filter Criteria: Either more than one root or no root present");
}

const boundaryTypeOfRoot = rootBoundaries[0]?.boundaryType;

const boundariesOfTypeOfSameAsRoot = boundaries.filter((boundary: any) => boundary.boundaryType === boundaryTypeOfRoot);

if (boundariesOfTypeOfSameAsRoot.length > 1) {
throw new Error("Invalid Filter Criteria: Only one root can be there for a given Boundary level");
}
}

Consider enhancing the error messages in validateFilters to provide more context and refactoring for improved readability.

function validateFilters(request: any) {
    const boundaries = request?.body?.Filters?.boundaries;
  
    if (!Array.isArray(boundaries)) {
-      throw new Error("Boundaries should be an array");
+      throw new Error("Invalid Filter Criteria: Boundaries should be provided as an array.");
    }
  
    const rootBoundaries = boundaries.filter((boundary: any) => boundary.isRoot);
  
    if (rootBoundaries.length !== 1) {
-      throw new Error("Invalid Filter Criteria: Either more than one root or no root present");
+      throw new Error("Invalid Filter Criteria: Exactly one root boundary is required.");
    }
  
    const boundaryTypeOfRoot = rootBoundaries[0]?.boundaryType;
    const boundariesOfTypeOfSameAsRoot = boundaries.filter((boundary: any) => boundary.boundaryType === boundaryTypeOfRoot);
  
    if (boundariesOfTypeOfSameAsRoot.length > 1) {
-      throw new Error("Invalid Filter Criteria: Only one root can be there for a given Boundary level");
+      throw new Error("Invalid Filter Criteria: Multiple root boundaries found for the same boundary level.");
    }
}

Committable suggestion:

function validateFilters(request: any) {
  const boundaries = request?.body?.Filters?.boundaries;
  if (!Array.isArray(boundaries)) {
    throw new Error("Invalid Filter Criteria: Boundaries should be provided as an array.");
  }
  const rootBoundaries = boundaries.filter((boundary: any) => boundary.isRoot);
  if (rootBoundaries.length !== 1) {
    throw new Error("Invalid Filter Criteria: Exactly one root boundary is required.");
  }
  const boundaryTypeOfRoot = rootBoundaries[0]?.boundaryType;
  const boundariesOfTypeOfSameAsRoot = boundaries.filter((boundary: any) => boundary.boundaryType === boundaryTypeOfRoot);
  if (boundariesOfTypeOfSameAsRoot.length > 1) {
    throw new Error("Invalid Filter Criteria: Multiple root boundaries found for the same boundary level.");
  }
}

Comment on lines +29 to +55
}
const getSheetData = async (fileUrl: string, sheetName: string, getRow = false) => {
const headers = {
'Content-Type': 'application/json',
Accept: 'application/pdf',
};

const responseFile = await httpRequest(fileUrl, null, {}, 'get', 'arraybuffer', headers);
const workbook = XLSX.read(responseFile, { type: 'buffer' });

// Check if the specified sheet exists in the workbook
if (!workbook.Sheets.hasOwnProperty(sheetName)) {
throw new Error(`Sheet with name "${sheetName}" is not present in the file.`);
}

const sheetData = XLSX.utils.sheet_to_json(workbook.Sheets[sheetName]);
const jsonData = sheetData.map((row: any, index: number) => {
const rowData: any = {};
Object.keys(row).forEach(key => {
rowData[key] = row[key] === undefined || row[key] === '' ? null : row[key];
});
if (getRow) rowData['!row#number!'] = index + 1; // Adding row number
return rowData;
});
logger.info("Sheet Data : " + JSON.stringify(jsonData))
return jsonData;
};

Consider removing or modifying the log statement that logs the entire sheet data to avoid potential data leakage and performance issues.

-  logger.info("Sheet Data : " + JSON.stringify(jsonData))

Also, improve error handling to include the original error message for better debugging.


Committable suggestion:

}
const getSheetData = async (fileUrl: string, sheetName: string, getRow = false) => {
  const headers = {
    'Content-Type': 'application/json',
    Accept: 'application/pdf',
  };
  const responseFile = await httpRequest(fileUrl, null, {}, 'get', 'arraybuffer', headers);
  const workbook = XLSX.read(responseFile, { type: 'buffer' });
  // Check if the specified sheet exists in the workbook
  if (!workbook.Sheets.hasOwnProperty(sheetName)) {
    throw new Error(`Sheet with name "${sheetName}" is not present in the file.`);
  }
  const sheetData = XLSX.utils.sheet_to_json(workbook.Sheets[sheetName]);
  const jsonData = sheetData.map((row: any, index: number) => {
    const rowData: any = {};
    Object.keys(row).forEach(key => {
      rowData[key] = row[key] === undefined || row[key] === '' ? null : row[key];
    });
    if (getRow) rowData['!row#number!'] = index + 1; // Adding row number
    return rowData;
  });
  return jsonData;
};

Comment on lines +1437 to +1458
for (const item of boundaryData) {
if (item.District && !uniqueDistricts.includes(item.District)) {
uniqueDistricts.push(item.District);
}
}
for (const district of uniqueDistricts) {
const districtDataFiltered = boundaryData.filter(item => item.District === district);
const districtIndex = Object.keys(districtDataFiltered[0]).indexOf('District');
const headers = Object.keys(districtDataFiltered[0]).slice(districtIndex);
const newSheetData = [headers];
for (const data of districtDataFiltered) {
const rowData = Object.values(data).slice(districtIndex).map(value => value === null ? '' : String(value)); // Replace null with empty string
newSheetData.push(rowData);
}
const ws = XLSX.utils.aoa_to_sheet(newSheetData);
XLSX.utils.book_append_sheet(workbook, ws, district);
}
return workbook;
} catch (error) {
throw Error("An error occurred while appending sheets:");
}
}

Refactor the appendSheetsToWorkbook function to improve readability and error handling.

  try {
    const uniqueDistricts: string[] = [];
    const uniqueDistrictsForMainSheet: string[] = [];
    const workbook = XLSX.utils.book_new();
    const mainSheetData: any[] = [];
    const headersForMainSheet = Object.keys(boundaryData[0]);
    mainSheetData.push(headersForMainSheet);
    // Existing logic...
  } catch (error) {
-   throw Error("An error occurred while appending sheets:");
+   logger.error("An error occurred while appending sheets: ", error);
+   throw error;
  }

This change ensures that errors are logged for easier debugging and that the original error is thrown to preserve the stack trace.


Committable suggestion:

  try {
    const uniqueDistricts: string[] = [];
    const uniqueDistrictsForMainSheet: string[] = [];
    const workbook = XLSX.utils.book_new();
    const mainSheetData: any[] = [];
    const headersForMainSheet = Object.keys(boundaryData[0]);
    mainSheetData.push(headersForMainSheet);
    for (const item of boundaryData) {
      if (item.District && !uniqueDistricts.includes(item.District)) {
        uniqueDistricts.push(item.District);
      }
    }
    for (const district of uniqueDistricts) {
      const districtDataFiltered = boundaryData.filter(item => item.District === district);
      const districtIndex = Object.keys(districtDataFiltered[0]).indexOf('District');
      const headers = Object.keys(districtDataFiltered[0]).slice(districtIndex);
      const newSheetData = [headers];
      for (const data of districtDataFiltered) {
        const rowData = Object.values(data).slice(districtIndex).map(value => value === null ? '' : String(value)); // Replace null with empty string
        newSheetData.push(rowData);
      }
      const ws = XLSX.utils.aoa_to_sheet(newSheetData);
      XLSX.utils.book_append_sheet(workbook, ws, district);
    }
    return workbook;
  } catch (error) {
    logger.error("An error occurred while appending sheets: ", error);
    throw error;
  }
}

Comment on lines +347 to +358
const params = {
...request?.query,
includeChildren: true
};
const boundaryData = await getBoundaryRelationshipData(request, params);
logger.info("boundaryData for sheet " + JSON.stringify(boundaryData))
if (boundaryData) {
const boundaryList = generateHierarchyList(boundaryData)
if (Array.isArray(boundaryList) && boundaryList.length > 0) {
const boundaryCodes = boundaryList.map(boundary => boundary.split(',').pop());
const string = boundaryCodes.join(', ');
const boundaryEntityResponse = await httpRequest(config.host.boundaryHost + config.paths.boundaryServiceSearch, request.body, { tenantId: "pg", codes: string });

const boundaryCodeNameMapping: { [key: string]: string } = {};
boundaryEntityResponse?.Boundary?.forEach((data: any) => {
boundaryCodeNameMapping[data?.code] = data?.additionalDetails?.name;
});

const hierarchy = await getHierarchy(request, request?.query?.tenantId, request?.query?.hierarchyType);
const startIndex = boundaryType ? hierarchy.indexOf(boundaryType) : -1;
const reducedHierarchy = startIndex !== -1 ? hierarchy.slice(startIndex) : hierarchy;
const headers = [...reducedHierarchy, "Boundary Code", "Target at the Selected Boundary level", "Start Date of Campaign (Optional Field)", "End Date of Campaign (Optional Field)"];
const data = boundaryList.map(boundary => {
const boundaryParts = boundary.split(',');
const boundaryCode = boundaryParts[boundaryParts.length - 1];
const rowData = boundaryParts.concat(Array(Math.max(0, reducedHierarchy.length - boundaryParts.length)).fill(''));
const mappedRowData = rowData.map((cell: any, index: number) =>
index === reducedHierarchy.length ? '' : cell !== '' ? boundaryCodeNameMapping[cell] || cell : ''
);
const boundaryCodeIndex = reducedHierarchy.length;
mappedRowData[boundaryCodeIndex] = boundaryCode;
return mappedRowData;
});
return await createExcelSheet(data, headers);
}
if (request?.body?.Filters) {
const filteredBoundaryData = await generateFilteredBoundaryData(request);
return await getDataSheetReady(filteredBoundaryData, request);
}
else {
return await getDataSheetReady(boundaryData, request);

Refactor getBoundarySheetData for improved readability by reducing nested conditions.

async function getBoundarySheetData(request: any) {
    const params = { ...request?.query, includeChildren: true };
    const boundaryData = await getBoundaryRelationshipData(request, params);
    logger.info("boundaryData for sheet " + JSON.stringify(boundaryData));

+   const dataToProcess = request?.body?.Filters ? await generateFilteredBoundaryData(request) : boundaryData;
-   if (request?.body?.Filters) {
-       const filteredBoundaryData = await generateFilteredBoundaryData(request);
-       return await getDataSheetReady(filteredBoundaryData, request);
-   } else {
-       return await getDataSheetReady(boundaryData, request);
-   }
+   return await getDataSheetReady(dataToProcess, request);
}

Committable suggestion:

  const params = {
    ...request?.query,
    includeChildren: true
  };
  const boundaryData = await getBoundaryRelationshipData(request, params);
  logger.info("boundaryData for sheet " + JSON.stringify(boundaryData))
  const dataToProcess = request?.body?.Filters ? await generateFilteredBoundaryData(request) : boundaryData;
  return await getDataSheetReady(dataToProcess, request);

Comment on lines +679 to +731
function filterBoundaries(boundaryData: any[], filters: any): any {
function filterRecursive(boundary: any): any {
const boundaryFilters = filters && filters.boundaries; // Accessing boundaries array from filters object
const filter = boundaryFilters?.find((f: any) => f.code === boundary.code && f.boundaryType === boundary.boundaryType);

if (!filter) {
return {
...boundary,
children: boundary.children.map(filterRecursive)
};
}

if (!boundary.children.length) {
if (!filter.includeAllChildren) {
throw new Error("Boundary cannot have includeAllChildren filter false if it does not have any children");
}
// If boundary has no children and includeAllChildren is true, return as is
return {
...boundary,
children: []
};
}

if (filter.includeAllChildren) {
// If includeAllChildren is true, return boundary with all children
return {
...boundary,
children: boundary.children.map(filterRecursive)
};
}

const filteredChildren: any[] = [];
boundary.children.forEach((child: any) => {
const matchingFilter = boundaryFilters.find((f: any) => f.code === child.code && f.boundaryType === child.boundaryType);
if (matchingFilter) {
filteredChildren.push(filterRecursive(child));
}
});
return {
...boundary,
children: filteredChildren
};
}
try {
const filteredData = boundaryData.map(filterRecursive);
return filteredData;
}
catch (e: any) {
const errorMessage = "Error occurred while fetching boundaries: " + e.message;
logger.error(errorMessage)
throw new Error(errorMessage);
}
}

Refactor filterBoundaries to simplify logic and improve readability.
Consider using a more declarative approach for filtering and mapping the boundaries.
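One possible shape of such a refactor, shown as a hedged sketch rather than a committable suggestion; the type aliases and function name are assumptions, and the leaf-node includeAllChildren check from the original is omitted for brevity:

type BoundaryFilter = { code: string; boundaryType: string; includeAllChildren?: boolean };
type BoundaryNode = { code: string; boundaryType: string; children: BoundaryNode[] };

function filterBoundariesDeclarative(boundaryData: BoundaryNode[], filters: { boundaries?: BoundaryFilter[] }): BoundaryNode[] {
  // Index the filters once so each node lookup is O(1).
  const filterIndex = new Map<string, BoundaryFilter>();
  for (const f of filters?.boundaries ?? []) {
    filterIndex.set(`${f.code}|${f.boundaryType}`, f);
  }

  const walk = (node: BoundaryNode): BoundaryNode => {
    const filter = filterIndex.get(`${node.code}|${node.boundaryType}`);
    // No matching filter, or includeAllChildren set: keep the node and recurse into every child.
    if (!filter || filter.includeAllChildren) {
      return { ...node, children: node.children.map(walk) };
    }
    // Otherwise keep only the children that are themselves named in the filters.
    const children = node.children
      .filter(child => filterIndex.has(`${child.code}|${child.boundaryType}`))
      .map(walk);
    return { ...node, children };
  };

  return boundaryData.map(walk);
}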

Comment on lines +733 to +770
function generateHierarchy(boundaries: any[]) {
// Create an object to store boundary types and their parents
const parentMap: any = {};

// Populate the object with boundary types and their parents
for (const boundary of boundaries) {
parentMap[boundary.boundaryType] = boundary.parentBoundaryType;
}

// Traverse the hierarchy to generate the hierarchy list
const hierarchyList = [];
for (const boundaryType in parentMap) {
if (Object.prototype.hasOwnProperty.call(parentMap, boundaryType)) {
const parentBoundaryType = parentMap[boundaryType];
if (parentBoundaryType === null) {
// This boundary type has no parent, add it to the hierarchy list
hierarchyList.push(boundaryType);
// Traverse its children recursively
traverseChildren(boundaryType, parentMap, hierarchyList);
}
}
}
return hierarchyList;
}

function traverseChildren(parent: any, parentMap: any, hierarchyList: any[]) {
for (const boundaryType in parentMap) {
if (Object.prototype.hasOwnProperty.call(parentMap, boundaryType)) {
const parentBoundaryType = parentMap[boundaryType];
if (parentBoundaryType === parent) {
// This boundary type has the current parent, add it to the hierarchy list
hierarchyList.push(boundaryType);
// Traverse its children recursively
traverseChildren(boundaryType, parentMap, hierarchyList);
}
}
}
}

Tip

Codebase Verification

The provided implementation of generateHierarchy and traverseChildren functions correctly establishes parent-child relationships by building a hierarchy list from a set of boundaries. However, there is no explicit mechanism observed in the code to detect or prevent cycles in the hierarchy. This means that while the functions are designed to traverse and organize boundaries into a hierarchy based on their parent-child relationships, they do not explicitly handle the case where cycles might exist in the input data.

Analysis chain

In generateHierarchy, verify that the parent-child relationships are correctly established and that there are no cycles in the hierarchy.

Scripts Executed

The following scripts were executed for the analysis:


Script:

#!/bin/bash
# Verify that generateHierarchy correctly handles parent-child relationships without creating cycles
grep -C3 "function generateHierarchy" utilities/project-factory/src/server/utils/campaignUtils.ts

Length of output: 254



Script:

#!/bin/bash
# Extract the full implementation of generateHierarchy and traverseChildren functions
grep -A30 "function generateHierarchy" utilities/project-factory/src/server/utils/campaignUtils.ts
grep -A20 "function traverseChildren" utilities/project-factory/src/server/utils/campaignUtils.ts

Length of output: 2334
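If cycle protection is desired, one hedged option is to thread a visited set through the traversal. The sketch below mirrors the traverseChildren signature from this PR but adds an extra parameter and a renamed function, both of which are assumptions rather than existing code:

function traverseChildrenSafe(
  parent: string,
  parentMap: Record<string, string | null>,
  hierarchyList: string[],
  visited: Set<string> = new Set()
) {
  for (const boundaryType of Object.keys(parentMap)) {
    if (parentMap[boundaryType] === parent) {
      if (visited.has(boundaryType)) {
        throw new Error(`Cycle detected in boundary hierarchy at "${boundaryType}"`);
      }
      visited.add(boundaryType);
      hierarchyList.push(boundaryType);
      traverseChildrenSafe(boundaryType, parentMap, hierarchyList, visited);
    }
  }
}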

@ashish-egov closed this Apr 4, 2024