
Create LOCALSETUP.md #709

Merged · 10 commits · May 28, 2024
32 changes: 32 additions & 0 deletions utilities/project-factory/LOCALSETUP.md
@@ -0,0 +1,32 @@
# Local Setup

To set up the ProjectFactory service on your local system, clone the [Digit Frontend repository](https://github.com/egovernments/DIGIT-Frontend).

## Dependencies

### Infra Dependency

- [x] Postgres DB
- [ ] Redis
- [ ] Elasticsearch
- [x] Kafka
- [x] Consumer
- [x] Producer

## Running Locally

### Local setup
1. To set up the ProjectFactory service, clone the [Digit Frontend repository](https://github.com/egovernments/DIGIT-Frontend).
2. Install Node.js version 20 using nvm (Node Version Manager).
3. Update the configs in `utilities/project-factory/src/server/config/index.ts`: change `HOST` to `"http://localhost:8080/"` and `KAFKA_BROKER_HOST` to `"localhost:9092"`.
4. Also update the DB config values to match your local system configuration.
5. Point all dependency service hosts either to a unified environment or to port-forwarded local services.
6. Open a terminal and run the following commands:

`cd utilities/project-factory/`

`yarn install` (run this command only once, after cloning the repo)

`yarn dev`

> Note: If a Kafka error appears after running the above command, make sure Kafka and Zookeeper are running in the background. If a connection error for another microservice appears, verify that the URL mentioned in the external mapping of the data config is correct, or port-forward that particular service.
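The config edits described in steps 3 and 4 can be sketched as the following fragment. This is illustrative only: the real config object in `utilities/project-factory/src/server/config/index.ts` defines many more keys, and your DB values should match your own local Postgres.

```typescript
// Illustrative sketch of the local overrides from steps 3-4 (not the full
// config file). Values shown are the local defaults the steps suggest.
const localConfig = {
  HOST: "http://localhost:8080/",       // step 3: local service host
  KAFKA_BROKER_HOST: "localhost:9092",  // step 3: local Kafka broker
  DB_CONFIG: {
    DB_USER: "postgres",                // step 4: match your local Postgres
    DB_HOST: "localhost",
    DB_NAME: "postgres",
    DB_PASSWORD: "postgres",
    DB_PORT: "5432",
  },
};

console.log(`${localConfig.HOST} -> ${localConfig.KAFKA_BROKER_HOST}`);
```

With these values in place, `yarn dev` should reach the local Kafka broker and Postgres instance directly.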
10 changes: 5 additions & 5 deletions utilities/project-factory/src/server/config/dbPoolConfig.ts
@@ -2,11 +2,11 @@ import { Pool } from 'pg';
import config from '.';

const pool = new Pool({
-    user: config.DB_USER,
-    host: config.DB_HOST,
-    database: config.DB_NAME,
-    password: config.DB_PASSWORD,
-    port: parseInt(config.DB_PORT)
+    user: config.DB_CONFIG.DB_USER,
+    host: config.DB_CONFIG.DB_HOST,
+    database: config.DB_CONFIG.DB_NAME,
+    password: config.DB_CONFIG.DB_PASSWORD,
+    port: parseInt(config.DB_CONFIG.DB_PORT)
});

export default pool;
25 changes: 10 additions & 15 deletions utilities/project-factory/src/server/config/index.ts
@@ -45,14 +45,15 @@ const config = {
KAFKA_CREATE_RESOURCE_ACTIVITY_TOPIC: process.env.KAFKA_CREATE_RESOURCE_ACTIVITY_TOPIC || "create-resource-activity",
KAFKA_UPDATE_GENERATED_RESOURCE_DETAILS_TOPIC: process.env.KAFKA_UPDATE_GENERATED_RESOURCE_DETAILS_TOPIC || "update-generated-resource-details",
KAFKA_CREATE_GENERATED_RESOURCE_DETAILS_TOPIC: process.env.KAFKA_CREATE_GENERATED_RESOURCE_DETAILS_TOPIC || "create-generated-resource-details",
// Default hierarchy type
hierarchyType: "NITISH",

// Database configuration
-    DB_USER: process.env.DB_USER || "postgres",
-    DB_HOST: process.env.DB_HOST?.split(':')[0] || "localhost",
-    DB_NAME: process.env.DB_NAME || "postgres",
-    DB_PASSWORD: process.env.DB_PASSWORD || "postgres",
-    DB_PORT: process.env.DB_PORT || "5432",
+    DB_CONFIG: {
+        DB_USER: process.env.DB_USER || "postgres",
+        DB_HOST: process.env.DB_HOST?.split(':')[0] || "localhost",
+        DB_NAME: process.env.DB_NAME || "postgres",
+        DB_PASSWORD: process.env.DB_PASSWORD || "postgres",
+        DB_PORT: process.env.DB_PORT || "5432",
+    },
// Application configuration
app: {
port: parseInt(process.env.APP_PORT || "8080") || 8080,
@@ -107,25 +108,19 @@ const config = {
localizationSearch: process.env.EGOV_LOCALIZATION_SEARCH || "localization/messages/v1/_search",
localizationCreate: "localization/messages/v1/_upsert",
projectTypeSearch: "project-factory/v1/project-type/search",
-    boundaryRelationshipCreate:"boundary-service/boundary-relationships/_create"
+    boundaryRelationshipCreate: "boundary-service/boundary-relationships/_create"
},
// Values configuration
values: {
userMainBoundary: "mz",
userMainBoundaryType: "Country",
parsingTemplate: "HCM.ParsingTemplate",
transfromTemplate: "HCM.TransformTemplate",
campaignType: "HCM.HCMTemplate",
APIResource: "HCM.APIResourceTemplate3",
idgen: {
format: process.env.CMP_IDGEN_FORMAT || "CMP-[cy:yyyy-MM-dd]-[SEQ_EG_CMP_ID]",
idName: process.env.CMP_IDGEN_IDNAME || "campaign.number"
},
matchFacilityData: false,
retryCount: process.env.CREATE_RESOURCE_RETRY_COUNT || "3"
},
// Default search template
SEARCH_TEMPLATE: "HCM.APIResourceTemplate3"
}
};
// Exporting getErrorCodes function and config object
export { getErrorCodes };
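A minimal sketch of how the nested `DB_CONFIG` block above derives its values from the environment, including the before-the-colon split applied to `DB_HOST`. The helper name `dbConfigFromEnv` is hypothetical; the actual file inlines this logic in the config object literal.

```typescript
// Hypothetical helper mirroring the DB_CONFIG block above: every value comes
// from the environment with a local-friendly default, and DB_HOST keeps only
// the part before a colon, so "host:port" strings collapse to just the host.
type Env = Record<string, string | undefined>;

function dbConfigFromEnv(env: Env) {
  return {
    DB_USER: env.DB_USER || "postgres",
    DB_HOST: env.DB_HOST?.split(":")[0] || "localhost",
    DB_NAME: env.DB_NAME || "postgres",
    DB_PASSWORD: env.DB_PASSWORD || "postgres",
    DB_PORT: env.DB_PORT || "5432",
  };
}

// With no overrides, everything falls back to the local defaults.
console.log(dbConfigFromEnv({}).DB_HOST); // → "localhost"
```

Nesting these keys under `DB_CONFIG` is why `dbPoolConfig.ts` now reads `config.DB_CONFIG.DB_PORT` instead of `config.DB_PORT`.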
65 changes: 1 addition & 64 deletions utilities/project-factory/src/server/utils/genericUtils.ts
@@ -4,7 +4,7 @@ import config, { getErrorCodes } from "../config/index";
import { v4 as uuidv4 } from 'uuid';
import { produceModifiedMessages } from "../kafka/Listener";
import { generateHierarchyList, getAllFacilities, getHierarchy } from "../api/campaignApis";
-import { searchMDMS, getCount, getBoundarySheetData, getSheetData, createAndUploadFile, createExcelSheet, getTargetSheetData, callMdmsData } from "../api/genericApis";
+import { getBoundarySheetData, getSheetData, createAndUploadFile, createExcelSheet, getTargetSheetData, callMdmsData } from "../api/genericApis";
import * as XLSX from 'xlsx';
import FormData from 'form-data';
import { logger } from "./logger";
@@ -17,7 +17,6 @@ import { getLocaleFromRequest, getLocalisationModuleName } from "./localisationU
import { getBoundaryColumnName, getBoundaryTabName } from "./boundaryUtils";
import { getBoundaryDataService } from "../service/dataManageService";
const NodeCache = require("node-cache");
-const _ = require('lodash');

const updateGeneratedResourceTopic = config.KAFKA_UPDATE_GENERATED_RESOURCE_DETAILS_TOPIC;
const createGeneratedResourceTopic = config.KAFKA_CREATE_GENERATED_RESOURCE_DETAILS_TOPIC;
@@ -366,67 +365,6 @@ async function getFinalUpdatedResponse(result: any, responseData: any, request:
});
}

-async function callSearchApi(request: any, response: any) {
-    try {
-        let result: any;
-        const { type } = request.query;
-        result = await searchMDMS([type], config.SEARCH_TEMPLATE, request.body.RequestInfo, response);
-        const filter = request?.body?.Filters;
-        const requestBody = { "RequestInfo": request?.body?.RequestInfo, filter };
-        const responseData = result?.mdms?.[0]?.data;
-        if (!responseData || responseData.length === 0) {
-            return errorResponder({ message: "Invalid ApiResource Type. Check Logs" }, request, response);
-        }
-        const host = responseData?.host;
-        const url = responseData?.searchConfig?.url;
-        var queryParams: any = {};
-        for (const searchItem of responseData?.searchConfig?.searchBody) {
-            if (searchItem.isInParams) {
-                queryParams[searchItem.path] = searchItem.value;
-            }
-            else if (searchItem.isInBody) {
-                _.set(requestBody, `${searchItem.path}`, searchItem.value);
-            }
-        }
-        const countknown = responseData?.searchConfig?.isCountGiven === true;
-        let responseDatas: any[] = [];
-        const searchPath = responseData?.searchConfig?.keyName;
-        let fetchedData: any;
-        let responseObject: any;
-
-        if (countknown) {
-            const count = await getCount(responseData, request, response);
-            let noOfTimesToFetchApi = Math.ceil(count / queryParams.limit);
-            for (let i = 0; i < noOfTimesToFetchApi; i++) {
-                responseObject = await httpRequest(host + url, requestBody, queryParams, undefined, undefined, undefined);
-                fetchedData = _.get(responseObject, searchPath);
-                fetchedData.forEach((item: any) => {
-                    responseDatas.push(item);
-                });
-                queryParams.offset = (parseInt(queryParams.offset) + parseInt(queryParams.limit)).toString();
-            }
-        }
-
-        else {
-            while (true) {
-                responseObject = await httpRequest(host + url, requestBody, queryParams, undefined, undefined, undefined);
-                fetchedData = _.get(responseObject, searchPath);
-                fetchedData.forEach((item: any) => {
-                    responseDatas.push(item);
-                });
-                queryParams.offset = (parseInt(queryParams.offset) + parseInt(queryParams.limit)).toString();
-                if (fetchedData.length < parseInt(queryParams.limit)) {
-                    break;
-                }
-            }
-        }
-        return responseDatas;
-    }
-    catch (e: any) {
-        logger.error(String(e))
-        return errorResponder({ message: String(e) + " Check Logs" }, request, response);
-    }
-}
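For reference, the offset-pagination loop that the deleted `callSearchApi` implemented can be sketched in isolation. The helper below is hypothetical (the original interleaved this loop with MDMS lookups and HTTP calls); it keeps only the core pattern: fetch page after page, advancing the offset by the page limit, until a short page signals the end.

```typescript
// Standalone sketch of the offset-pagination loop from the removed function:
// keep fetching pages until a page shorter than the limit arrives.
type FetchPage = (offset: number, limit: number) => Promise<any[]>;

async function fetchAllPages(fetchPage: FetchPage, limit: number): Promise<any[]> {
  const results: any[] = [];
  let offset = 0;
  while (true) {
    const page = await fetchPage(offset, limit);
    results.push(...page);
    offset += limit;
    if (page.length < limit) break; // short page => no more data
  }
  return results;
}

// Example with an in-memory "API" of 25 records and a page size of 10.
const records = Array.from({ length: 25 }, (_, i) => ({ id: i }));
fetchAllPages(async (o, l) => records.slice(o, o + l), 10)
  .then((all) => console.log(all.length)); // → 25
```

Note that when the total is an exact multiple of the limit, the loop makes one extra request that returns an empty page before terminating; the count-driven branch of the original avoided that by computing the number of fetches up front.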


async function fullProcessFlowForNewEntry(newEntryResponse: any, generatedResource: any, request: any) {
@@ -1003,7 +941,6 @@ export {
generateAuditDetails,
generateActivityMessage,
getResponseFromDb,
-callSearchApi,
getModifiedResponse,
getNewEntryResponse,
getOldEntryResponse,