From 7a8c71185c81dfc6a2ee9570cf1ad1a93a482f86 Mon Sep 17 00:00:00 2001 From: Ian Hopkinson Date: Fri, 31 May 2024 07:16:53 +0100 Subject: [PATCH 1/5] HDX-9874 Change references to HAPI to HDX HAPI --- docs/contact.md | 4 +- docs/data.md | 2 +- docs/geo.md | 2 +- docs/getting-started.md | 2 +- docs/index.md | 62 +++++++++---------- hdx_hapi/config/doc_snippets.py | 14 ++--- hdx_hapi/endpoints/get_admin_level.py | 12 ++-- hdx_hapi/endpoints/get_encoded_identifier.py | 2 +- hdx_hapi/endpoints/get_hdx_metadata.py | 14 ++--- .../endpoints/get_humanitarian_response.py | 16 ++--- main.py | 4 +- 11 files changed, 67 insertions(+), 67 deletions(-) diff --git a/docs/contact.md b/docs/contact.md index 9266603a..6daa0088 100644 --- a/docs/contact.md +++ b/docs/contact.md @@ -4,8 +4,8 @@ We appreciate your interest in our work. Here is how to contact us and get involved: -- [Bug reports and feature requests - TBD](fix/this/link) for HAPI are welcome. -- If you have questions or comments about HAPI or the HDX platform, send an email to [hdx@un.org](mailto:hdx@un.org). +- [Bug reports and feature requests - TBD](fix/this/link) for HDX HAPI are welcome. +- If you have questions or comments about HDX HAPI or the HDX platform, send an email to [hdx@un.org](mailto:hdx@un.org). - If you would like to be involved in periodic user research about HDX or related services, fill in [this form](https://docs.google.com/forms/d/e/1FAIpQLSdjN3mcDJ8BX-nu4F1veKEa8dYPlRvVcyahev8QjX7qHtha4g/viewform) and we will be in touch. - For general comments or project ideas, send an email to [centrehumdata@un.org](mailto:centrehumdata@un.org) and we will respond as soon as we can. - [Sign up](bit.ly/humdatamailing) to receive the Centre’s newsletter for updates on our work. Visit bit.ly/humdatamailing \ No newline at end of file diff --git a/docs/data.md b/docs/data.md index 1f4fd8c9..be5eda47 100644 --- a/docs/data.md +++ b/docs/data.md @@ -2,7 +2,7 @@ --- -The initial aim of HAPI is to cover all countries that have a humanitarian response plan and cover the data categories from HDX data grids +The initial aim of HDX HAPI is to cover all countries that have a humanitarian response plan and cover the data categories from HDX data grids | | Affected People - Humanitarian Needs | Affected People - Refugees | Coordination Context - Conflict Events | Coordination Context Funding | Coordination Context - National-risk | Coordination Context - Operational Presence | Food - Food-price | Food - Food-security | Population Social - Population | Population Social - Poverty Rate | |:----------------------------------:|:----------------------------------:|:-----------------------------------:|:----------------------------:|:----------------------------------:|:-----------------------------------------:|:---------------:|:------------------:|:----------------------------:|:------------------------------:|:------------------------------:| diff --git a/docs/geo.md b/docs/geo.md index 2f9a9019..e612f301 100644 --- a/docs/geo.md +++ b/docs/geo.md @@ -2,7 +2,7 @@ --- -Much of the data in HAPI references a geographical area. The complimentary geodata to this is provided by ITOS via ARCGIS service accessible here: [https://apps.itos.uga.edu/CODV2API/api/v1/](https://apps.itos.uga.edu/CODV2API/api/v1/) +Much of the data in HDX HAPI references a geographical area. 
The complementary geodata for this is provided by ITOS via an ArcGIS service, accessible here: [https://apps.itos.uga.edu/CODV2API/api/v1/](https://apps.itos.uga.edu/CODV2API/api/v1/)
 
 This service contains the common operational datasets administration boundaries and it can be accessed in a number formats. Enhanced datasets have been standardised and contain the data formatting. Check the [CODs Dashboard](https://cod.unocha.org/) to see the status of different countries.
 
diff --git a/docs/getting-started.md b/docs/getting-started.md
index 29c3e07c..1ce637c6 100644
--- a/docs/getting-started.md
+++ b/docs/getting-started.md
@@ -9,7 +9,7 @@ Below, you will find example URLs to help you learn how to construct your API qu
 
 ## Generating a key
 
-To access HAPI you need to generate an API key. This can be done via the the [sandbox interface encode_identifier endpoint](https://stage.hapi-humdata-org.ahconu.org/docs#/Utility/get_encoded_identifier_api_v1_encode_identifier_get). Enter your application name and email address and it will return the api key. The key must be included as a query string parameter e.g.
+To access HDX HAPI you need to generate an API key. This can be done via the [sandbox interface encode_identifier endpoint](https://stage.hapi-humdata-org.ahconu.org/docs#/Utility/get_encoded_identifier_api_v1_encode_identifier_get). Enter your application name and email address and it will return the API key. The key must be included as a query string parameter, e.g.
 
 ```
 
diff --git a/docs/index.md b/docs/index.md
index 20be3882..f7938cdb 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -3,14 +3,14 @@
 
 ---
 
-The HDX Humanitarian API (HAPI) is a way to access standardised indicators from multiple sources to automate workflows and visualisations
+The HDX Humanitarian API (HDX HAPI) is a way to access standardised indicators from multiple sources to automate workflows and visualisations.
 
-HAPI is in beta phase, and we are seeking feedback. To share your thoughts or join our slack channel, send an email to [hdx@un.org](hdx@un.org).
+HDX HAPI is in its beta phase, and we are seeking feedback. To share your thoughts or to join our Slack channel, send an email to [hdx@un.org](mailto:hdx@un.org).
 
-The initial scope of HAPI will be the data included in the [HDX Data Grids](https://data.humdata.org/dashboards/overview-of-data-grids). Work is ongoing to continually add more data
+The initial scope of HDX HAPI will be the data included in the [HDX Data Grids](https://data.humdata.org/dashboards/overview-of-data-grids). Work is ongoing to continually add more data.
 
 ## API Key
 
-To access HAPI you need to generate an API key. This can be done via the the [sandbox interface encode_identifier endpoint](https://stage.hapi-humdata-org.ahconu.org/docs#/Utility/get_encoded_identifier_api_v1_encode_identifier_get). Enter your application name and email address and it will return the api key. The key must be included as a query string parameter e.g.
+To access HDX HAPI you need to generate an API key. This can be done via the [sandbox interface encode_identifier endpoint](https://stage.hapi-humdata-org.ahconu.org/docs#/Utility/get_encoded_identifier_api_v1_encode_identifier_get). Enter your application name and email address and it will return the API key. The key must be included as a query string parameter, e.g.
``` https://stage.hapi-humdata-org.ahconu.org/api/v1/themes/3w?app_identifier={your api key} @@ -21,11 +21,11 @@ https://stage.hapi-humdata-org.ahconu.org/api/v1/themes/3w?app_identifier={your [The HDX Terms of Service](https://data.humdata.org/faqs/terms) -## The Structure of HAPI +## The Structure of HDX HAPI ### Data Subcategory Endpoints HAPI is organised around a set of key humanitarian data subcategories like **Baseline Population** and **3W - Operational Presence**. Each of these indicators can be queried via its endpoint. -### Current list of data subcategory endpoints in HAPI +### Current list of data subcategory endpoints in HDX HAPI #### Affected People - [Humanitarian Needs](https://stage.hapi-humdata-org.ahconu.org/docs#/Affected%20people/get_humanitarian_needs_api_v1_affected_people_humanitarian_needs_get) - [Refugees](https://stage.hapi-humdata-org.ahconu.org/docs#/Affected%20people/get_refugees_api_v1_affected_people_refugees_get) @@ -44,50 +44,50 @@ HAPI is organised around a set of key humanitarian data subcategories like **Bas ### Supporting Tables Additional supporting endpoints provide information about locations, codelists, and metadata. -### Current list of supporting endpoints in HAPI +### Current list of supporting endpoints in HDX HAPI - [Location](https://stage.hapi-humdata-org.ahconu.org/docs#/Locations%20and%20Administrative%20Divisions/get_locations_api_v1_metadata_location_get): Get the lists of locations (countries and similar), and administrative subdivisions used as location references in HAPI. These are taken from the [Common Operational Datasets](https://data.humdata.org/dashboards/cod) -- [admin1](https://stage.hapi-humdata-org.ahconu.org/docs#/Locations%20and%20Administrative%20Divisions/get_admin1_api_v1_metadata_admin1_get): Retrieve metadata about the source of any data available in HAPI. -- [admin2](https://stage.hapi-humdata-org.ahconu.org/docs#/Locations%20and%20Administrative%20Divisions/get_admin2_api_v1_metadata_admin2_get): Retrieve metadata about the source of any data available in HAPI. -- [hdx-metadata](https://placeholder.url/docs#/hdx-metadata): Retrieve metadata about the source of any data available in HAPI. +- [admin1](https://stage.hapi-humdata-org.ahconu.org/docs#/Locations%20and%20Administrative%20Divisions/get_admin1_api_v1_metadata_admin1_get): Retrieve metadata about the source of any data available in HDX HAPI. +- [admin2](https://stage.hapi-humdata-org.ahconu.org/docs#/Locations%20and%20Administrative%20Divisions/get_admin2_api_v1_metadata_admin2_get): Retrieve metadata about the source of any data available in HDX HAPI. +- [hdx-metadata](https://placeholder.url/docs#/hdx-metadata): Retrieve metadata about the source of any data available in HDX HAPI. ## FAQS ### What is an API ? An API, or Application Programming Interface, is a set of rules and tools that allows different software programs to communicate with each other. It enables developers to interact with external software components or resources efficiently, facilitating operations such as data retrieval, updates, and complex integrations. -### What is HAPI? -HAPI (the Humanitarian API) is an API designed to streamline access to key datasets related to humanitarian response. The API standardises data from a variety of sources and makes them consistently available. +### What is HDX HAPI? +HDX HAPI (the Humanitarian API) is an API designed to streamline access to key datasets related to humanitarian response. 
The API standardises data from a variety of sources and makes them consistently available. -### Who is HAPI for? -HAPI is designed for developers, researchers and anyone interested in accessing a centralised source of humanitarian data for analysis and decision-making. +### Who is HDX HAPI for? +HDX HAPI is designed for developers, researchers and anyone interested in accessing a centralised source of humanitarian data for analysis and decision-making. -### How do I access HAPI? -You can access HAPI through the API endpoints. Head to the HAPI documentation to get started. +### How do I access HDX HAPI? +You can access HDX HAPI through the API endpoints. Head to the HDX HAPI documentation to get started. -### Do I need an account to access HAPI? -You do not need an account to access HAPI, but you do need an access token which can be generated via the API. +### Do I need an account to access HDX HAPI? +You do not need an account to access HDX HAPI, but you do need an access token which can be generated via the API. -### What time period does the data in the current version of HAPI cover? -The time period covered by the data in the beta version of HAPI varies depending on the resource. Please see the [detailed documentation] for more details. All data contains a reference period. Our goal is to consistently integrate the most up-to-date data from HDX into HAPI. +### What time period does the data in the current version of HDX HAPI cover? +The time period covered by the data in the beta version of HDX HAPI varies depending on the resource. Please see the [detailed documentation] for more details. All data contains a reference period. Our goal is to consistently integrate the most up-to-date data from HDX into HDX HAPI. ### How have key datasets been selected? -Key datasets in the beta-phase HAPI have been selected based on their usage and relevance to pressing humanitarian needs. HAPI aims to incorporate the data in the HDX Data Grids, covering countries with a Humanitarian Response Plan. +Key datasets in the beta-phase HDX HAPI have been selected based on their usage and relevance to pressing humanitarian needs. HDX HAPI aims to incorporate the data in the HDX Data Grids, covering countries with a Humanitarian Response Plan. ### What is a sub-category? -### How up-to-date is the data in HAPI? -HAPI is updated from source data daily. Each dataset’s update frequency varies from daily, weekly, yearly and as needed. Please check the source dataset for further detail. +### How up-to-date is the data in HDX HAPI? +HDX HAPI is updated from source data daily. Each dataset’s update frequency varies from daily, weekly, yearly and as needed. Please check the source dataset for further detail. -### Is the data in HAPI different from the data I can download from HDX? -The data in HAPI is from selected datasets from HDX. Some of the data will have been standardised, such as aligning sector names. +### Is the data in HDX HAPI different from the data I can download from HDX? +The data in HDX HAPI is from selected datasets from HDX. Some of the data will have been standardised, such as aligning sector names. -In the coming months, the standardised datasets that HAPI produces will be added to the source datasets on HDX as downloadable CSV files for easy use in spreadsheet applications. +In the coming months, the standardised datasets that HDX HAPI produces will be added to the source datasets on HDX as downloadable CSV files for easy use in spreadsheet applications. 
-### How is HAPI different from the HDX CKAN API? -The HDX CKAN API provides programmatic access to metadata from HDX. HAPI provides queryable access to the data values themselves. +### How is HDX HAPI different from the HDX CKAN API? +The HDX CKAN API provides programmatic access to metadata from HDX. HDX HAPI provides queryable access to the data values themselves. -### Why would I use HAPI instead of other organisations’ APIs? -HAPI brings together a core set of humanitarian data in one place, with standardised references. HAPI integrates APIs of other organisations and pulls through non-API data. +### Why would I use HDX HAPI instead of other organisations’ APIs? +HDX HAPI brings together a core set of humanitarian data in one place, with standardised references. HDX HAPI integrates APIs of other organisations and pulls through non-API data. -### How do I give feedback for HAPI? +### How do I give feedback for HDX HAPI? Please send all feedback to hdx@un.org. diff --git a/hdx_hapi/config/doc_snippets.py b/hdx_hapi/config/doc_snippets.py index 0d287838..1c191dbc 100644 --- a/hdx_hapi/config/doc_snippets.py +++ b/hdx_hapi/config/doc_snippets.py @@ -24,19 +24,19 @@ ) DOC_LOCATION_CODE = 'Filter the response by a location (typically a country). The location codes use the ISO-3 (ISO 3166 alpha-3) codes.' DOC_LOCATION_NAME = 'Filter the response by a location (typically a country). The location names are based on the "short name" from the UN M49 Standard.' -DOC_ORG_ACRONYM = 'Filter the response by the standard acronym used to represent the organization. When data is brought into the HAPI database, an attempt is made to standardize the acronyms.' -DOC_ORG_NAME = 'Filter the response by the standard name used to represent the organization. When data is brought into the HAPI database, an attempt is made to standardize the acronyms.' +DOC_ORG_ACRONYM = 'Filter the response by the standard acronym used to represent the organization. When data is brought into the HDX HAPI database, an attempt is made to standardize the acronyms.' +DOC_ORG_NAME = 'Filter the response by the standard name used to represent the organization. When data is brought into the HDX HAPI database, an attempt is made to standardize the acronyms.' DOC_ORG_TYPE_CODE = 'Filter the response by the organization type code.' DOC_ORG_TYPE_DESCRIPTION = 'Filter the response by the organization type description.' -DOC_SCOPE_DISCLAIMER = f'Not all data are available for all locations. Learn more about the scope of data coverage in HAPI in the Overview and Getting Started documentation.' +DOC_SCOPE_DISCLAIMER = f'Not all data are available for all locations. Learn more about the scope of data coverage in HDX HAPI in the Overview and Getting Started documentation.' DOC_SECTOR_CODE = 'Filter the response by the sector code.' DOC_SECTOR_NAME = 'Filter the response by the sector name.' DOC_UPDATE_DATE_MIN = 'Min date of update date, e.g. 2020-01-01 or 2020-01-01T00:00:00' DOC_UPDATE_DATE_MAX = 'Max date of update date, e.g. 2020-01-01 or 2020-01-01T00:00:00' -DOC_HAPI_UPDATED_DATE_MIN = 'Min date of HAPI updated date, e.g. 2020-01-01 or 2020-01-01T00:00:00' -DOC_HAPI_UPDATED_DATE_MAX = 'Max date of HAPI updated date, e.g. 2020-01-01 or 2020-01-01T00:00:00' -DOC_HAPI_REPLACED_DATE_MIN = 'Min date of HAPI replaced date, e.g. 2020-01-01 or 2020-01-01T00:00:00' -DOC_HAPI_REPLACED_DATE_MAX = 'Max date of HAPI replaced date, e.g. 2020-01-01 or 2020-01-01T00:00:00' +DOC_HAPI_UPDATED_DATE_MIN = 'Min date of HDX HAPI updated date, e.g. 
2020-01-01 or 2020-01-01T00:00:00' +DOC_HAPI_UPDATED_DATE_MAX = 'Max date of HDX HAPI updated date, e.g. 2020-01-01 or 2020-01-01T00:00:00' +DOC_HAPI_REPLACED_DATE_MIN = 'Min date of HDX HAPI replaced date, e.g. 2020-01-01 or 2020-01-01T00:00:00' +DOC_HAPI_REPLACED_DATE_MAX = 'Max date of HDX HAPI replaced date, e.g. 2020-01-01 or 2020-01-01T00:00:00' DOC_SEE_ADMIN1 = 'See the admin1 endpoint for details.' DOC_SEE_ADMIN2 = 'See the admin2 endpoint for details.' diff --git a/hdx_hapi/endpoints/get_admin_level.py b/hdx_hapi/endpoints/get_admin_level.py index a0087cf2..54c97e29 100644 --- a/hdx_hapi/endpoints/get_admin_level.py +++ b/hdx_hapi/endpoints/get_admin_level.py @@ -39,13 +39,13 @@ @router.get( '/api/metadata/location', response_model=HapiGenericResponse[LocationResponse], - summary='Get the list of locations (typically countries) included in HAPI', + summary='Get the list of locations (typically countries) included in HDX HAPI', include_in_schema=False, ) @router.get( '/api/v1/metadata/location', response_model=HapiGenericResponse[LocationResponse], - summary='Get the list of locations (typically countries) included in HAPI', + summary='Get the list of locations (typically countries) included in HDX HAPI', ) async def get_locations( # ref_period_parameters: Annotated[ReferencePeriodParameters, Depends(reference_period_parameters)], @@ -67,7 +67,7 @@ async def get_locations( get_locations.__doc__ = ( - 'Not all data are available for all locations. Learn more about the scope of data coverage in HAPI in ' + 'Not all data are available for all locations. Learn more about the scope of data coverage in HDX HAPI in ' f'the Overview and Getting Started documentation.' ) @@ -123,7 +123,7 @@ async def get_admin1( get_admin1.__doc__ = ( - 'Not all data are available for all locations. Learn more about the scope of data coverage in HAPI in ' + 'Not all data are available for all locations. Learn more about the scope of data coverage in HDX HAPI in ' f'the Overview and Getting Started documentation.' 
) @@ -131,13 +131,13 @@ async def get_admin1( @router.get( '/api/metadata/admin2', response_model=HapiGenericResponse[Admin2Response], - summary='Get the list of second-level administrative divisions available in HAPI', + summary='Get the list of second-level administrative divisions available in HDX HAPI', include_in_schema=False, ) @router.get( '/api/v1/metadata/admin2', response_model=HapiGenericResponse[Admin2Response], - summary='Get the list of second-level administrative divisions available in HAPI', + summary='Get the list of second-level administrative divisions available in HDX HAPI', ) async def get_admin2( # ref_period_parameters: Annotated[ReferencePeriodParameters, Depends(reference_period_parameters)], diff --git a/hdx_hapi/endpoints/get_encoded_identifier.py b/hdx_hapi/endpoints/get_encoded_identifier.py index 26fc2537..d18764cc 100644 --- a/hdx_hapi/endpoints/get_encoded_identifier.py +++ b/hdx_hapi/endpoints/get_encoded_identifier.py @@ -29,7 +29,7 @@ async def get_encoded_identifier( email: Annotated[EmailStr, email_identifier_query], ): """ - Encode an application name and email address in base64 to serve as an client identifier in HAPI calls + Encode an application name and email address in base64 to serve as an client identifier in HDX HAPI calls """ encoded_identifier = base64.b64encode(bytes(f'{application}:{email}', 'utf-8')) diff --git a/hdx_hapi/endpoints/get_hdx_metadata.py b/hdx_hapi/endpoints/get_hdx_metadata.py index 85249117..8a007bc2 100644 --- a/hdx_hapi/endpoints/get_hdx_metadata.py +++ b/hdx_hapi/endpoints/get_hdx_metadata.py @@ -40,13 +40,13 @@ @router.get( '/api/metadata/dataset', response_model=HapiGenericResponse[DatasetResponse], - summary='Get information about the sources of the data in HAPI', + summary='Get information about the sources of the data in HDX HAPI', include_in_schema=False, ) @router.get( '/api/v1/metadata/dataset', response_model=HapiGenericResponse[DatasetResponse], - summary='Get information about the sources of the data in HAPI', + summary='Get information about the sources of the data in HDX HAPI', ) async def get_datasets( common_parameters: Annotated[CommonEndpointParams, Depends(common_endpoint_parameters)], @@ -60,7 +60,7 @@ async def get_datasets( ): """ Get information about the HDX Datasets that are used as data sources - for HAPI. Datasets contain one or more resources, which are the sources of the data found in HAPI. + for HDX HAPI. Datasets contain one or more resources, which are the sources of the data found in HDX HAPI. """ result = await get_datasets_srv( pagination_parameters=common_parameters, @@ -77,13 +77,13 @@ async def get_datasets( @router.get( '/api/metadata/resource', response_model=HapiGenericResponse[ResourceResponse], - summary='Get information about the sources of the data in HAPI', + summary='Get information about the sources of the data in HDX HAPI', include_in_schema=False, ) @router.get( '/api/v1/metadata/resource', response_model=HapiGenericResponse[ResourceResponse], - summary='Get information about the sources of the data in HAPI', + summary='Get information about the sources of the data in HDX HAPI', ) async def get_resources( common_parameters: Annotated[CommonEndpointParams, Depends(common_endpoint_parameters)], @@ -111,8 +111,8 @@ async def get_resources( output_format: OutputFormat = OutputFormat.JSON, ): """ - Get information about the resources that are used as data sources for HAPI. Datasets contain one or more resources, - which are the sources of the data found in HAPI. 
+ Get information about the resources that are used as data sources for HDX HAPI. Datasets contain one or + more resources, which are the sources of the data found in HDX HAPI. """ result = await get_resources_srv( pagination_parameters=common_parameters, diff --git a/hdx_hapi/endpoints/get_humanitarian_response.py b/hdx_hapi/endpoints/get_humanitarian_response.py index 62fe2d39..841e0ef6 100644 --- a/hdx_hapi/endpoints/get_humanitarian_response.py +++ b/hdx_hapi/endpoints/get_humanitarian_response.py @@ -36,13 +36,13 @@ @router.get( '/api/metadata/org', response_model=HapiGenericResponse[OrgResponse], - summary='Get the list of organizations represented in the data available in HAPI', + summary='Get the list of organizations represented in the data available in HDX HAPI', include_in_schema=False, ) @router.get( '/api/v1/metadata/org', response_model=HapiGenericResponse[OrgResponse], - summary='Get the list of organizations represented in the data available in HAPI', + summary='Get the list of organizations represented in the data available in HDX HAPI', ) async def get_orgs( common_parameters: Annotated[CommonEndpointParams, Depends(common_endpoint_parameters)], @@ -83,13 +83,13 @@ async def get_orgs( @router.get( '/api/metadata/org_type', response_model=HapiGenericResponse[OrgTypeResponse], - summary='Get information about how organizations are classified in HAPI', + summary='Get information about how organizations are classified in HDX HAPI', include_in_schema=False, ) @router.get( '/api/v1/metadata/org-type', response_model=HapiGenericResponse[OrgTypeResponse], - summary='Get information about how organizations are classified in HAPI', + summary='Get information about how organizations are classified in HDX HAPI', ) async def get_org_types( common_parameters: Annotated[CommonEndpointParams, Depends(common_endpoint_parameters)], @@ -105,8 +105,8 @@ async def get_org_types( ] = None, output_format: OutputFormat = OutputFormat.JSON, ): - """There is no agreed standard for the classification of organizations. The codes and descriptions used in HAPI are - based on this dataset. + """There is no agreed standard for the classification of organizations. The codes and descriptions used in HDX HAPI + are based on this dataset. """ result = await get_org_types_srv(pagination_parameters=common_parameters, db=db, code=code, description=description) return transform_result_to_csv_stream_if_requested(result, output_format, OrgTypeResponse) @@ -134,8 +134,8 @@ async def get_sectors( ] = None, output_format: OutputFormat = OutputFormat.JSON, ): - """There is no consistent standard for the humanitarian sectors. The codes and descriptions used in HAPI are based - on this dataset. + """There is no consistent standard for the humanitarian sectors. The codes and descriptions used in HDX HAPI are + based on this dataset. """ result = await get_sectors_srv( pagination_parameters=common_parameters, diff --git a/main.py b/main.py index 314ec40c..194b8604 100644 --- a/main.py +++ b/main.py @@ -44,8 +44,8 @@ CONFIG = get_config() app = FastAPI( - title='HAPI', - description='The Humanitarian API (HAPI) is a service of the Humanitarian Data Exchange (HDX), part of UNOCHA\'s Centre for Humanitarian Data.\nThis is the reference documentation of the API. You may want to get started here', # noqa + title='HDX HAPI', + description='The Humanitarian API (HDX HAPI) is a service of the Humanitarian Data Exchange (HDX), part of UNOCHA\'s Centre for Humanitarian Data.\nThis is the reference documentation of the API. 
You may want to get started here', # noqa version='0.1.0', docs_url=None, servers=[{'url': CONFIG.HAPI_SERVER_URL}] if CONFIG.HAPI_SERVER_URL else [], From 97f380a18d582906e978326e449ad7d3e65d7764 Mon Sep 17 00:00:00 2001 From: Ian Hopkinson Date: Fri, 31 May 2024 07:27:17 +0100 Subject: [PATCH 2/5] HDX-9874 Update to mkdocs.yml which may need to be reverted --- mkdocs.yml | 2 +- 1 file changed, 1 insertion(+), 1 deletion(-) diff --git a/mkdocs.yml b/mkdocs.yml index 495668ba..d8f5e35c 100644 --- a/mkdocs.yml +++ b/mkdocs.yml @@ -1,4 +1,4 @@ -site_name: HAPI - The Humanitarian API +site_name: HDX HAPI - The Humanitarian API nav: - Home: index.md - Getting Started: getting-started.md From b3a063fbc1cd1e1aa7e3d4b4719b0262150bd327 Mon Sep 17 00:00:00 2001 From: Ian Hopkinson Date: Fri, 31 May 2024 09:33:24 +0100 Subject: [PATCH 3/5] HDX-9869 Clarify dataset endpoint text --- CONTRIBUTING.md | 8 ++++---- hdx_hapi/config/doc_snippets.py | 8 ++++---- 2 files changed, 8 insertions(+), 8 deletions(-) diff --git a/CONTRIBUTING.md b/CONTRIBUTING.md index f2332707..6dd37943 100644 --- a/CONTRIBUTING.md +++ b/CONTRIBUTING.md @@ -1,6 +1,6 @@ # Contributing -The Humanitarian API (HAPI) is being developed by a team from the [Centre for Humanitarian Data](https://centre.humdata.org/). +The Humanitarian API (HDX HAPI) is being developed by a team from the [Centre for Humanitarian Data](https://centre.humdata.org/). HDX developers are using [Visual Code](https://code.visualstudio.com/) as a standard IDE for this project with development taking place inside Docker containers. @@ -26,15 +26,15 @@ docker-compose exec -T hapi sh -c "pytest --log-level=INFO --cov=. --cov-report As an integration test the `docs` endpoint is inspected "manually". -A local copy of HAPI can be run by importing a snapshot of the database using the following shell script invocation in the host machine. +A local copy of HDX HAPI can be run by importing a snapshot of the database using the following shell script invocation in the host machine. ```shell ./restore_database.sh https://github.com/OCHA-DAP/hapi-pipelines/raw/db-export/database/hapi_db.pg_restore hapi ``` -The HAPI application can then be launched using the `start` launch configuration in Visual Code, this serves the documentation at `http://localhost:8844/docs` and the API at `http://localhost:8844/api` in the host machine. +The HDX HAPI application can then be launched using the `start` launch configuration in Visual Code, this serves the documentation at `http://localhost:8844/docs` and the API at `http://localhost:8844/api` in the host machine. -The HAPI database can be accessed locally with the following connection details: +The HDX HAPI database can be accessed locally with the following connection details: ``` URL: jdbc:postgresql://localhost:45432/hapi diff --git a/hdx_hapi/config/doc_snippets.py b/hdx_hapi/config/doc_snippets.py index 1c191dbc..a28b8123 100644 --- a/hdx_hapi/config/doc_snippets.py +++ b/hdx_hapi/config/doc_snippets.py @@ -12,11 +12,11 @@ DOC_GENDER_CODE = 'Filter the response by the gender code.' DOC_GENDER = 'Filter the response by the gender.' DOC_GENDER_DESCRIPTION = 'Filter the response by the gender description.' -DOC_HDX_DATASET_ID = 'Filter the response by the dataset ID, which is a unique and fixed identifier of a Dataset on HDX. A URL in the pattern of `https://data.humdata.org/dataset/[dataset id]` will load the dataset page on HDX.' -DOC_HDX_DATASET_NAME = 'Filter the response by the URL-safe name of the dataset as displayed on HDX. 
This name is unique but can change. A URL in the pattern of `https://data.humdata.org/dataset/[dataset name]` will load the dataset page on HDX.' +DOC_HDX_DATASET_ID = 'Filter the response by the dataset ID (hdx_id), which is a unique and fixed identifier of a Dataset on HDX. A URL in the pattern of `https://data.humdata.org/dataset/[hdx_id]` will load the dataset page on HDX.' +DOC_HDX_DATASET_NAME = 'Filter the response by the URL-safe name (hdx_stub) of the dataset as displayed on HDX. This name is unique but can change. A URL in the pattern of `https://data.humdata.org/dataset/[hdx_stub]` will load the dataset page on HDX.' DOC_HDX_DATASET_TITLE = 'Filter the response by the title of the dataset as it appears in the HDX interface. This name is not unique and can change.' -DOC_HDX_PROVIDER_STUB = "Filter the response by the code of the provider of the dataset on HDX. A URL in the pattern of `https://data.humdata.org/organization/[org stub]` will load the provider's page on HDX." -DOC_HDX_PROVIDER_NAME = 'Filter the response by the display name of the provider of the dataset on HDX.' +DOC_HDX_PROVIDER_STUB = "Filter the response by the code of the provider (organization) of the dataset on HDX. A URL in the pattern of `https://data.humdata.org/organization/[hdx_provider_stub]` will load the provider's page on HDX." +DOC_HDX_PROVIDER_NAME = 'Filter the response by the display name of the provider (organization) of the dataset on HDX.' DOC_HDX_RESOURCE_ID = 'Filter the response by the resource ID, which is a unique and fixed identifier of a Dataset on HDX. A URL in the pattern of `https://data.humdata.org/dataset/[dataset id]/resource/[resource id]` will load the dataset page on HDX.' DOC_HDX_RESOURCE_FORMAT = 'Filter the response by the format of the resource on HDX. These are typically file formats, but can also include APIs and web apps.' DOC_HDX_RESOURCE_HXL = ( From 62a042bdca21dd5cd194683c11c13b5287266119 Mon Sep 17 00:00:00 2001 From: Ian Hopkinson Date: Fri, 31 May 2024 09:52:15 +0100 Subject: [PATCH 4/5] HDX-9868 Update overview text in docs UI --- main.py | 13 ++++++++++++- 1 file changed, 12 insertions(+), 1 deletion(-) diff --git a/main.py b/main.py index 194b8604..a5fd76c5 100644 --- a/main.py +++ b/main.py @@ -42,10 +42,21 @@ logger = logging.getLogger(__name__) CONFIG = get_config() +DESCRIPTION = """ +The Humanitarian API (HDX HAPI) is a service of the Humanitarian Data Exchange (HDX), part of UNOCHA\'s Centre for Humanitarian Data.\nThis is the reference documentation of the API. You may want to get started here + +All queries require an `app_identifier` which can be supplied as a query parameter or as a header (`X-HDX-HAPI-APP-IDENTIFIER`). The `app_identifier` is simply a base64 encoded version of a user supplied application name and email address. + +The `limit` and `offset` parameters are available for all queries and have the usual database meanings to provide pagination of results. + +The `output_format` parameter is available for all queries and can be set to JSON or csv, where JSON is selected rows of data are supplied under a data key. + +Query parameters that access string fields are implicitly wildcards and case insensitive so that `location_name=Mali` will return data for Mali and Somalia. +""" app = FastAPI( title='HDX HAPI', - description='The Humanitarian API (HDX HAPI) is a service of the Humanitarian Data Exchange (HDX), part of UNOCHA\'s Centre for Humanitarian Data.\nThis is the reference documentation of the API. 
You may want to get started here', # noqa
+    description=DESCRIPTION, # noqa
     version='0.1.0',
     docs_url=None,
     servers=[{'url': CONFIG.HAPI_SERVER_URL}] if CONFIG.HAPI_SERVER_URL else [],

From 49f3f4b3661c34a852c3dc19aaf9b172538938d7 Mon Sep 17 00:00:00 2001
From: Ian Hopkinson
Date: Fri, 31 May 2024 10:00:35 +0100
Subject: [PATCH 5/5] HDX-9868 Linter fix

---
 main.py | 23 ++++++++++++++++-------
 1 file changed, 16 insertions(+), 7 deletions(-)

diff --git a/main.py b/main.py
index a5fd76c5..a5ddf9a5 100644
--- a/main.py
+++ b/main.py
@@ -43,20 +43,29 @@
 CONFIG = get_config()
 DESCRIPTION = """
-The Humanitarian API (HDX HAPI) is a service of the Humanitarian Data Exchange (HDX), part of UNOCHA\'s Centre for Humanitarian Data.\nThis is the reference documentation of the API. You may want to get started here
+The Humanitarian API (HDX HAPI) is a service of the
+Humanitarian Data Exchange (HDX), part of UNOCHA\'s
+Centre for Humanitarian Data.
+\nThis is the reference documentation of the API.
+You may want to get started here
 
-All queries require an `app_identifier` which can be supplied as a query parameter or as a header (`X-HDX-HAPI-APP-IDENTIFIER`). The `app_identifier` is simply a base64 encoded version of a user supplied application name and email address.
+All queries require an `app_identifier`, which can be supplied as a query parameter or as a header
+(`X-HDX-HAPI-APP-IDENTIFIER`). The `app_identifier` is simply a base64-encoded version of a user-supplied
+application name and email address.
 
-The `limit` and `offset` parameters are available for all queries and have the usual database meanings to provide pagination of results.
+The `limit` and `offset` parameters are available for all queries and have the usual database meanings
+to provide pagination of results.
 
-The `output_format` parameter is available for all queries and can be set to JSON or csv, where JSON is selected rows of data are supplied under a data key.
+The `output_format` parameter is available for all queries and can be set to JSON or CSV;
+when JSON is selected, rows of data are supplied under a `data` key.
 
-Query parameters that access string fields are implicitly wildcards and case insensitive so that `location_name=Mali` will return data for Mali and Somalia.
-"""
+Query parameters that access string fields are implicitly wildcards and case insensitive,
+so that `location_name=Mali` will return data for both Mali and Somalia (`Mali` matches within `Somalia`).
+""" # noqa
 
 app = FastAPI(
     title='HDX HAPI',
-    description=DESCRIPTION, # noqa
+    description=DESCRIPTION,
     version='0.1.0',
     docs_url=None,
     servers=[{'url': CONFIG.HAPI_SERVER_URL}] if CONFIG.HAPI_SERVER_URL else [],
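As an illustration of the access pattern described in the new `DESCRIPTION` text, here is a minimal client sketch. It is not part of the patch series: the application name, email address, and parameter values are placeholders, and the endpoint and host are taken from the docs and diffs above.

```python
import base64

import requests  # assumed to be available in the client environment

application = 'my-app'        # placeholder application name
email = 'user@example.org'    # placeholder email address

# Same encoding the encode_identifier endpoint performs:
# base64.b64encode(bytes(f'{application}:{email}', 'utf-8'))
app_identifier = base64.b64encode(f'{application}:{email}'.encode('utf-8')).decode('ascii')

response = requests.get(
    'https://stage.hapi-humdata-org.ahconu.org/api/v1/metadata/location',
    params={
        'app_identifier': app_identifier,  # required for all queries
        'limit': 10,                       # pagination, as described above
        'offset': 0,
    },
)
response.raise_for_status()
print(response.json()['data'])  # JSON rows are supplied under a `data` key
```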