diff --git a/Makefile b/Makefile index 7119bad856..7851c1f1c5 100644 --- a/Makefile +++ b/Makefile @@ -86,14 +86,14 @@ test-python-unit: python -m pytest -n 8 --color=yes sdk/python/tests test-python-integration: - python -m pytest -n 8 --integration --color=yes --durations=10 --timeout=1200 --timeout_method=thread \ + python -m pytest -n 4 --integration --color=yes --durations=10 --timeout=1200 --timeout_method=thread \ -k "(not snowflake or not test_historical_features_main)" \ sdk/python/tests test-python-integration-local: FEAST_IS_LOCAL_TEST=True \ FEAST_LOCAL_ONLINE_CONTAINER=True \ - python -m pytest -n 8 --color=yes --integration --durations=5 --dist loadgroup \ + python -m pytest -n 4 --color=yes --integration --durations=10 --timeout=1200 --timeout_method=thread --dist loadgroup \ -k "not test_lambda_materialization and not test_snowflake_materialization" \ sdk/python/tests diff --git a/docs/SUMMARY.md b/docs/SUMMARY.md index 0060ae729e..bba4929749 100644 --- a/docs/SUMMARY.md +++ b/docs/SUMMARY.md @@ -16,6 +16,7 @@ * [Feature retrieval](getting-started/concepts/feature-retrieval.md) * [Point-in-time joins](getting-started/concepts/point-in-time-joins.md) * [Registry](getting-started/concepts/registry.md) + * [Role-Based Access Control (RBAC)](getting-started/architecture/rbac.md) * [\[Alpha\] Saved dataset](getting-started/concepts/dataset.md) * [Architecture](getting-started/architecture/README.md) * [Overview](getting-started/architecture/overview.md) diff --git a/docs/getting-started/architecture/README.md b/docs/getting-started/architecture/README.md index f824164339..030bc62f4c 100644 --- a/docs/getting-started/architecture/README.md +++ b/docs/getting-started/architecture/README.md @@ -23,3 +23,7 @@ {% content-ref url="model-inference.md" %} [model-inference.md](model-inference.md) {% endcontent-ref %} + +{% content-ref url="rbac.md" %} +[rbac.md](rbac.md) +{% endcontent-ref %} \ No newline at end of file diff --git a/docs/getting-started/architecture/overview.md b/docs/getting-started/architecture/overview.md index 44fa5ac260..86ee75aaa6 100644 --- a/docs/getting-started/architecture/overview.md +++ b/docs/getting-started/architecture/overview.md @@ -17,3 +17,7 @@ typically your Offline Store). We are exploring adding a default streaming engin write patterns](write-patterns.md) to your application * We recommend [using Python](language.md) for your Feature Store microservice. As mentioned in the document, precomputing features is the recommended optimal path to ensure low latency performance. Reducing feature serving to a lightweight database lookup is the ideal pattern, which means the marginal overhead of Python should be tolerable. Because of this we believe the pros of Python outweigh the costs, as reimplementing feature logic is undesirable. Java and Go Clients are also available for online feature retrieval. + +* [Role-Based Access Control (RBAC)](rbac.md) is a security mechanism that restricts access to resources based on the roles of individual users within an organization. In the context of the Feast, RBAC ensures that only authorized users or groups can access or modify specific resources, thereby maintaining data security and operational integrity. 
+
+
diff --git a/docs/getting-started/architecture/rbac.jpg b/docs/getting-started/architecture/rbac.jpg
new file mode 100644
index 0000000000..0de87d1718
Binary files /dev/null and b/docs/getting-started/architecture/rbac.jpg differ
diff --git a/docs/getting-started/architecture/rbac.md b/docs/getting-started/architecture/rbac.md
new file mode 100644
index 0000000000..9a51fba6ac
--- /dev/null
+++ b/docs/getting-started/architecture/rbac.md
@@ -0,0 +1,56 @@
+# Role-Based Access Control (RBAC) in Feast
+
+## Introduction
+
+Role-Based Access Control (RBAC) is a security mechanism that restricts access to resources based on the roles of individual users within an organization. In the context of Feast, RBAC ensures that only authorized users or groups can access or modify specific resources, thereby maintaining data security and operational integrity.
+
+## Functional Requirements
+
+The RBAC implementation in Feast is designed to:
+
+- **Assign Permissions**: Allow administrators to assign permissions for various operations and resources to users or groups based on their roles.
+- **Seamless Integration**: Integrate smoothly with existing business code without requiring significant modifications.
+- **Backward Compatibility**: Maintain support for non-authorized models as the default to ensure backward compatibility.
+
+## Business Goals
+
+The primary business goals of implementing RBAC in Feast are:
+
+1. **Feature Sharing**: Enable multiple teams to share the feature store while ensuring controlled access. This allows for collaborative work without compromising data security.
+2. **Access Control Management**: Prevent unauthorized access to team-specific resources and spaces, governing the operations that each user or group can perform.
+
+## Reference Architecture
+
+Feast operates as a collection of connected services, each enforcing authorization permissions. The architecture is designed as a distributed microservices system with the following key components:
+
+- **Service Endpoints**: These enforce authorization permissions, ensuring that only authorized requests are processed.
+- **Client Integration**: Clients authenticate with feature servers by attaching an authorization token to each request.
+- **Service-to-Service Communication**: Communication between Feast services is always granted.
+
+![rbac.jpg](rbac.jpg)
+
+## Permission Model
+
+The RBAC system in Feast uses a permission model that defines the following concepts:
+
+- **Resource**: An object within Feast that needs to be secured against unauthorized access.
+- **Action**: A logical operation performed on a resource, such as Create, Describe, Update, Delete, Read, or Write.
+- **Policy**: A set of rules that enforce authorization decisions on resources. The default implementation uses role-based policies.
+
+
+
+## Authorization Architecture
+
+The authorization architecture in Feast is built with the following components:
+
+- **Token Extractor**: Extracts the authorization token from the request header.
+- **Token Parser**: Parses the token to retrieve user details.
+- **Policy Enforcer**: Validates the secured endpoint against the retrieved user details.
+- **Token Injector**: Adds the authorization token to each secured request header.
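To make the client integration and token injection above concrete, here is a minimal sketch (not part of this change) of a raw call to the online feature server with a bearer token attached. The port and feature name follow the standard quickstart setup, the token value is a placeholder, and the `requests` library is assumed; in practice the Feast client proxies attach the token for you.

```py
import requests

# Placeholder: in practice the Feast client obtains this token from the
# configured OIDC server or from the Kubernetes service account.
token = "<access-token>"

# The feature server extracts the bearer token, resolves the user's roles,
# and only then evaluates the configured permission policies.
response = requests.post(
    "http://localhost:6566/get-online-features",
    headers={"Authorization": f"Bearer {token}"},
    json={
        "features": ["driver_hourly_stats:conv_rate"],
        "entities": {"driver_id": [1001]},
    },
)
response.raise_for_status()
print(response.json())
```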
+ + + + + + + diff --git a/docs/getting-started/components/README.md b/docs/getting-started/components/README.md index d468714bd4..e1c000abce 100644 --- a/docs/getting-started/components/README.md +++ b/docs/getting-started/components/README.md @@ -19,3 +19,7 @@ {% content-ref url="provider.md" %} [provider.md](provider.md) {% endcontent-ref %} + +{% content-ref url="authz_manager.md" %} +[authz_manager.md](authz_manager.md) +{% endcontent-ref %} diff --git a/docs/getting-started/components/authz_manager.md b/docs/getting-started/components/authz_manager.md new file mode 100644 index 0000000000..09ca4d1366 --- /dev/null +++ b/docs/getting-started/components/authz_manager.md @@ -0,0 +1,102 @@ +# Authorization Manager +An Authorization Manager is an instance of the `AuthManager` class that is plugged into one of the Feast servers to extract user details from the current request and inject them into the [permissions](../../getting-started/concepts/permissions.md) framework. + +{% hint style="info" %} +**Note**: Feast does not provide authentication capabilities; it is the client's responsibility to manage the authentication token and pass it to +the Feast server, which then validates the token and extracts user details from the configured authentication server. +{% endhint %} + +Two authorization managers are supported out-of-the-box: +* One using a configurable OIDC server to extract the user details. +* One using the Kubernetes RBAC resources to extract the user details. + +These instances are created when the Feast servers are initialized, according to the authorization configuration defined in +their own `feature_store.yaml`. + +Feast servers and clients must have consistent authorization configuration, so that the client proxies can automatically inject +the authorization tokens that the server can properly identify and use to enforce permission validations. + + +## Design notes +The server-side implementation of the authorization functionality is defined [here](./../../../sdk/python/feast/permissions/server). +Few of the key models, classes to understand the authorization implementation on the client side can be found [here](./../../../sdk/python/feast/permissions/client). + +## Configuring Authorization +The authorization is configured using a dedicated `auth` section in the `feature_store.yaml` configuration. + +**Note**: As a consequence, when deploying the Feast servers with the Helm [charts](../../../infra/charts/feast-feature-server/README.md), +the `feature_store_yaml_base64` value must include the `auth` section to specify the authorization configuration. + +### No Authorization +This configuration applies the default `no_auth` authorization: +```yaml +project: my-project +auth: + type: no_auth +... +``` + +### OIDC Authorization +With OIDC authorization, the Feast client proxies retrieve the JWT token from an OIDC server (or [Identity Provider](https://openid.net/developers/how-connect-works/)) +and append it in every request to a Feast server, using an [Authorization Bearer Token](https://developer.mozilla.org/en-US/docs/Web/HTTP/Authentication#bearer). + +The server, in turn, uses the same OIDC server to validate the token and extract the user roles from the token itself. + +Some assumptions are made in the OIDC server configuration: +* The OIDC token refers to a client with roles matching the RBAC roles of the configured `Permission`s (*) +* The roles are exposed in the access token passed to the server + +(*) Please note that **the role match is case-sensitive**, e.g. 
the name of the role in the OIDC server and in the `Permission` configuration
+must be exactly the same.
+
+For example, the access token for a client `app` of a user with the `reader` role should have the following `resource_access` section:
+```json
+{
+  "resource_access": {
+    "app": {
+      "roles": ["reader"]
+    }
+  }
+}
+```
+
+
+An example of OIDC authorization configuration is the following:
+```yaml
+project: my-project
+auth:
+  type: oidc
+  client_id: _CLIENT_ID__
+  client_secret: _CLIENT_SECRET__
+  realm: _REALM__
+  auth_server_url: _OIDC_SERVER_URL_
+  auth_discovery_url: _OIDC_SERVER_URL_/realms/master/.well-known/openid-configuration
+...
+```
+
+On the client side, the following settings must be added to specify the current user:
+```yaml
+auth:
+  ...
+  username: _USERNAME_
+  password: _PASSWORD_
+```
+
+### Kubernetes RBAC Authorization
+With Kubernetes RBAC Authorization, the client uses the service account token as the authorization bearer token, and the
+server fetches the associated roles from the Kubernetes RBAC resources.
+
+An example of Kubernetes RBAC authorization configuration is the following:
+{% hint style="info" %}
+**NOTE**: This configuration will only work if you deploy Feast on OpenShift or another Kubernetes platform.
+{% endhint %}
+```yaml
+project: my-project
+auth:
+  type: kubernetes
+...
+```
+
+If the client does not run on the same cluster as the servers, the client token can be injected using the `LOCAL_K8S_TOKEN`
+environment variable on the client side. The value must refer to the token of a service account created on the servers' cluster
+and linked to the desired RBAC roles.
\ No newline at end of file
diff --git a/docs/getting-started/components/overview.md b/docs/getting-started/components/overview.md
index 393f436e5b..0ee3835de6 100644
--- a/docs/getting-started/components/overview.md
+++ b/docs/getting-started/components/overview.md
@@ -28,3 +28,4 @@ A complete Feast deployment contains the following components:
 * **Batch Materialization Engine:** The [Batch Materialization Engine](batch-materialization-engine.md) component launches a process which loads data into the online store from the offline store. By default, Feast uses a local in-process engine implementation to materialize data. However, additional infrastructure can be used for a more scalable materialization process.
 * **Online Store:** The online store is a database that stores only the latest feature values for each entity. The online store is either populated through materialization jobs or through [stream ingestion](../../reference/data-sources/push.md).
 * **Offline Store:** The offline store persists batch data that has been ingested into Feast. This data is used for producing training datasets. For feature retrieval and materialization, Feast does not manage the offline store directly, but runs queries against it. However, offline stores can be configured to support writes if Feast configures logging functionality of served features.
+* **Authorization manager**: The authorization manager detects authentication tokens from client requests to Feast servers and uses this information to enforce permission policies on the requested services.
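For the out-of-cluster Kubernetes case described in the authorization manager documentation above, a rough client-side sketch could look like the following. It assumes `auth: type: kubernetes` in the client's `feature_store.yaml`; the token value is a placeholder for a service-account token bound to the desired RBAC roles, and setting it before constructing the store is an assumption rather than a documented requirement.

```py
import os

from feast import FeatureStore

# Placeholder: token of a service account created on the servers' cluster and
# bound to the RBAC roles referenced by the configured permissions.
os.environ["LOCAL_K8S_TOKEN"] = "<service-account-token>"

# With `auth: type: kubernetes` in feature_store.yaml, the client proxies attach
# this token to every request; the servers resolve roles via Kubernetes RBAC.
store = FeatureStore(repo_path=".")
print([fv.name for fv in store.list_feature_views()])
```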
diff --git a/docs/getting-started/concepts/README.md b/docs/getting-started/concepts/README.md index e805e3b486..1769a2d741 100644 --- a/docs/getting-started/concepts/README.md +++ b/docs/getting-started/concepts/README.md @@ -31,3 +31,7 @@ {% content-ref url="dataset.md" %} [dataset.md](dataset.md) {% endcontent-ref %} + +{% content-ref url="permission.md" %} +[permission.md](permission.md) +{% endcontent-ref %} diff --git a/docs/getting-started/concepts/permission.md b/docs/getting-started/concepts/permission.md new file mode 100644 index 0000000000..5bca1bd568 --- /dev/null +++ b/docs/getting-started/concepts/permission.md @@ -0,0 +1,112 @@ +# Permission + +## Overview + +The Feast permissions model allows to configure granular permission policies to all the resources defined in a feature store. + +The configured permissions are stored in the Feast registry and accessible through the CLI and the registry APIs. + +The permission authorization enforcement is performed when requests are executed through one of the Feast (Python) servers +- The online feature server (REST) +- The offline feature server (Arrow Flight) +- The registry server (gRPC) + +Note that there is no permission enforcement when accessing the Feast API with a local provider. + +## Concepts + +The permission model is based on the following components: +- A `resource` is a Feast object that we want to secure against unauthorized access. + - We assume that the resource has a `name` attribute and optional dictionary of associated key-value `tags`. +- An `action` is a logical operation executed on the secured resource, like: + - `create`: Create an instance. + - `describe`: Access the instance state. + - `update`: Update the instance state. + - `delete`: Delete an instance. + - `read`: Read both online and offline stores. + - `read_online`: Read the online store. + - `read_offline`: Read the offline store. + - `write`: Write on any store. + - `write_online`: Write to the online store. + - `write_offline`: Write to the offline store. +- A `policy` identifies the rule for enforcing authorization decisions on secured resources, based on the current user. + - A default implementation is provided for role-based policies, using the user roles to grant or deny access to the requested actions + on the secured resources. + +The `Permission` class identifies a single permission configured on the feature store and is identified by these attributes: +- `name`: The permission name. +- `types`: The list of protected resource types. Defaults to all managed types, e.g. the `ALL_RESOURCE_TYPES` alias. All sub-classes are included in the resource match. +- `name_pattern`: A regex to match the resource name. Defaults to `None`, meaning that no name filtering is applied +- `required_tags`: Dictionary of key-value pairs that must match the resource tags. Defaults to `None`, meaning that no tags filtering is applied. +- `actions`: The actions authorized by this permission. Defaults to `ALL_VALUES`, an alias defined in the `action` module. +- `policy`: The policy to be applied to validate a client request. + +To simplify configuration, several constants are defined to streamline the permissions setup: +- In module `feast.feast_object`: + - `ALL_RESOURCE_TYPES` is the list of all the `FeastObject` types. + - `ALL_FEATURE_VIEW_TYPES` is the list of all the feature view types, including those not inheriting from `FeatureView` type like + `OnDemandFeatureView`. 
+- In module `feast.permissions.action`: + - `ALL_ACTIONS` is the list of all managed actions. + - `READ` includes all the read actions for online and offline store. + - `WRITE` includes all the write actions for online and offline store. + - `CRUD` includes all the state management actions to create, describe, update or delete a Feast resource. + +Given the above definitions, the feature store can be configured with granular control over each resource, enabling partitioned access by +teams to meet organizational requirements for service and data sharing, and protection of sensitive information. + +The `feast` CLI includes a new `permissions` command to list the registered permissions, with options to identify the matching resources for each configured permission and the existing resources that are not covered by any permission. + +{% hint style="info" %} +**Note**: Feast resources that do not match any of the configured permissions are not secured by any authorization policy, meaning any user can execute any action on such resources. +{% endhint %} + +## Definition examples +This permission definition grants access to the resource state and the ability to read all of the stores for any feature view or +feature service to all users with the role `super-reader`: +```py +Permission( + name="feature-reader", + types=[FeatureView, FeatureService], + policy=RoleBasedPolicy(roles=["super-reader"]), + actions=[AuthzedAction.DESCRIBE, READ], +) +``` + +This example grants permission to write on all the data sources with `risk_level` tag set to `high` only to users with role `admin` or `data_team`: +```py +Permission( + name="ds-writer", + types=[DataSource], + required_tags={"risk_level": "high"}, + policy=RoleBasedPolicy(roles=["admin", "data_team"]), + actions=[AuthzedAction.WRITE], +) +``` + +{% hint style="info" %} +**Note**: When using multiple roles in a role-based policy, the user must be granted at least one of the specified roles. +{% endhint %} + + +The following permission grants authorization to read the offline store of all the feature views including `risky` in the name, to users with role `trusted`: + +```py +Permission( + name="reader", + types=[FeatureView], + name_pattern=".*risky.*", + policy=RoleBasedPolicy(roles=["trusted"]), + actions=[AuthzedAction.READ_OFFLINE], +) +``` + +## Authorization configuration +In order to leverage the permission functionality, the `auth` section is needed in the `feature_store.yaml` configuration. +Currently, Feast supports OIDC and Kubernetes RBAC authorization protocols. + +The default configuration, if you don't specify the `auth` configuration section, is `no_auth`, indicating that no permission +enforcement is applied. + +The `auth` section includes a `type` field specifying the actual authorization protocol, and protocol-specific fields that +are specified in [Authorization Manager](../components/authz_manager.md). diff --git a/docs/reference/feast-cli-commands.md b/docs/reference/feast-cli-commands.md index afcfcfef64..be31720034 100644 --- a/docs/reference/feast-cli-commands.md +++ b/docs/reference/feast-cli-commands.md @@ -24,6 +24,7 @@ Commands: init Create a new Feast repository materialize Run a (non-incremental) materialization job to... materialize-incremental Run an incremental materialization job to ingest... 
+ permissions Access permissions registry-dump Print contents of the metadata registry teardown Tear down deployed feature store infrastructure version Display Feast SDK version @@ -155,6 +156,143 @@ Load data from feature views into the online store, beginning from either the pr feast materialize-incremental 2022-01-01T00:00:00 ``` +## Permissions + +### List permissions +List all registered permission + +```text +feast permissions list + +Options: + --tags TEXT Filter by tags (e.g. --tags 'key:value' --tags 'key:value, + key:value, ...'). Items return when ALL tags match. + -v, --verbose Print the resources matching each configured permission +``` + +```text ++-----------------------+-------------+-----------------------+-----------+----------------+-------------------------+ +| NAME | TYPES | NAME_PATTERN | ACTIONS | ROLES | REQUIRED_TAGS | ++=======================+=============+=======================+===========+================+================+========+ +| reader_permission1234 | FeatureView | transformed_conv_rate | DESCRIBE | reader | - | ++-----------------------+-------------+-----------------------+-----------+----------------+-------------------------+ +| writer_permission1234 | FeatureView | transformed_conv_rate | CREATE | writer | - | ++-----------------------+-------------+-----------------------+-----------+----------------+-------------------------+ +| special | FeatureView | special.* | DESCRIBE | admin | test-key2 : test-value2 | +| | | | UPDATE | special-reader | test-key : test-value | ++-----------------------+-------------+-----------------------+-----------+----------------+-------------------------+ +``` + +`verbose` option describes the resources matching each configured permission: + +```text +feast permissions list -v +``` + +```text +Permissions: + +permissions +├── reader_permission1234 ['reader'] +│ └── FeatureView: none +└── writer_permission1234 ['writer'] + ├── FeatureView: none + │── OnDemandFeatureView: ['transformed_conv_rate_fresh', 'transformed_conv_rate'] + └── BatchFeatureView: ['driver_hourly_stats', 'driver_hourly_stats_fresh'] +``` + +### Describe a permission +Describes the provided permission + +```text +feast permissions describe permission-name +name: permission-name +types: +- FEATURE_VIEW +namePattern: transformed_conv_rate +requiredTags: + required1: required-value1 + required2: required-value2 +actions: +- DESCRIBE +policy: + roleBasedPolicy: + roles: + - reader +tags: + key1: value1 + key2: value2 + +``` + +### List of the configured roles +List all the configured roles + +```text +feast permissions list-roles + +Options: + --verbose Print the resources and actions permitted to each configured + role +``` + +```text +ROLE NAME +admin +reader +writer +``` + +`verbose` option describes the resources and actions permitted to each managed role: + +```text +feast permissions list-roles -v +``` + +```text +ROLE NAME RESOURCE NAME RESOURCE TYPE PERMITTED ACTIONS +admin driver_hourly_stats_source FileSource CREATE + DELETE + QUERY_OFFLINE + QUERY_ONLINE + DESCRIBE + UPDATE +admin vals_to_add RequestSource CREATE + DELETE + QUERY_OFFLINE + QUERY_ONLINE + DESCRIBE + UPDATE +admin driver_stats_push_source PushSource CREATE + DELETE + QUERY_OFFLINE + QUERY_ONLINE + DESCRIBE + UPDATE +admin driver_hourly_stats_source FileSource CREATE + DELETE + QUERY_OFFLINE + QUERY_ONLINE + DESCRIBE + UPDATE +admin vals_to_add RequestSource CREATE + DELETE + QUERY_OFFLINE + QUERY_ONLINE + DESCRIBE + UPDATE +admin driver_stats_push_source PushSource CREATE + 
DELETE + QUERY_OFFLINE + QUERY_ONLINE + DESCRIBE + UPDATE +reader driver_hourly_stats FeatureView DESCRIBE +reader driver_hourly_stats_fresh FeatureView DESCRIBE +... +``` + + ## Teardown Tear down deployed feature store infrastructure diff --git a/docs/reference/feature-servers/offline-feature-server.md b/docs/reference/feature-servers/offline-feature-server.md index 6c2fdf7a25..1db5adacd8 100644 --- a/docs/reference/feature-servers/offline-feature-server.md +++ b/docs/reference/feature-servers/offline-feature-server.md @@ -33,3 +33,20 @@ Please see the detail how to configure offline store client [remote-offline-stor The set of functionalities supported by remote offline stores is the same as those supported by offline stores with the SDK, which are described in detail [here](../offline-stores/overview.md#functionality). +# Offline Feature Server Permissions and Access Control + +## API Endpoints and Permissions + +| Endpoint | Resource Type | Permission | Description | +| ------------------------------------- |------------------|---------------|---------------------------------------------------| +| offline_write_batch | FeatureView | Write Offline | Write a batch of data to the offline store | +| write_logged_features | FeatureService | Write Offline | Write logged features to the offline store | +| persist | DataSource | Write Offline | Persist the result of a read in the offline store | +| get_historical_features | FeatureView | Read Offline | Retrieve historical features | +| pull_all_from_table_or_query | DataSource | Read Offline | Pull all data from a table or read it | +| pull_latest_from_table_or_query | DataSource | Read Offline | Pull the latest data from a table or read it | + + +## How to configure Authentication and Authorization ? + +Please refer the [page](./../../../docs/getting-started/concepts/permission.md) for more details on how to configure authentication and authorization. \ No newline at end of file diff --git a/docs/reference/feature-servers/python-feature-server.md b/docs/reference/feature-servers/python-feature-server.md index 33dfe77ae1..255b85e606 100644 --- a/docs/reference/feature-servers/python-feature-server.md +++ b/docs/reference/feature-servers/python-feature-server.md @@ -199,3 +199,19 @@ requests.post( "http://localhost:6566/push", data=json.dumps(push_data)) ``` + +# Online Feature Server Permissions and Access Control + +## API Endpoints and Permissions + +| Endpoint | Resource Type | Permission | Description | +| ---------------------------- |---------------------------------|-------------------------------------------------------| ------------------------------------------------------------------------ | +| /get-online-features | FeatureView,OnDemandFeatureView | Read Online | Get online features from the feature store | +| /push | FeatureView | Write Online, Write Offline, Write Online and Offline | Push features to the feature store (online, offline, or both) | +| /write-to-online-store | FeatureView | Write Online | Write features to the online store | +| /materialize | FeatureView | Write Online | Materialize features within a specified time range | +| /materialize-incremental | FeatureView | Write Online | Incrementally materialize features up to a specified timestamp | + +## How to configure Authentication and Authorization ? + +Please refer the [page](./../../../docs/getting-started/concepts/permission.md) for more details on how to configure authentication and authorization. 
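As a concrete illustration of the endpoint table above, here is a hedged sketch of a permission that would authorize `/get-online-features` (Read Online) and `/write-to-online-store` (Write Online) for feature views. The permission name and role are made up for illustration; like other Feast objects, such a definition would live in the feature repository and be applied to the registry.

```py
from feast import FeatureView
from feast.permissions.action import AuthzedAction
from feast.permissions.permission import Permission
from feast.permissions.policy import RoleBasedPolicy

# Users carrying the "serving" role may read from and write to the online
# store for any feature view matched by this permission.
online_serving = Permission(
    name="online-serving",
    types=[FeatureView],
    policy=RoleBasedPolicy(roles=["serving"]),
    actions=[AuthzedAction.READ_ONLINE, AuthzedAction.WRITE_ONLINE],
)
```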
\ No newline at end of file diff --git a/docs/reference/offline-stores/remote-offline-store.md b/docs/reference/offline-stores/remote-offline-store.md index 0179e0f06f..8057ae3284 100644 --- a/docs/reference/offline-stores/remote-offline-store.md +++ b/docs/reference/offline-stores/remote-offline-store.md @@ -25,4 +25,7 @@ The complete example can be find under [remote-offline-store-example](../../../e ## How to configure the server -Please see the detail how to configure offline feature server [offline-feature-server.md](../feature-servers/offline-feature-server.md) \ No newline at end of file +Please see the detail how to configure offline feature server [offline-feature-server.md](../feature-servers/offline-feature-server.md) + +## How to configure Authentication and Authorization +Please refer the [page](./../../../docs/getting-started/concepts/permission.md) for more details on how to configure authentication and authorization. diff --git a/docs/reference/online-stores/remote.md b/docs/reference/online-stores/remote.md index c560fa6f22..4dd4fb65b5 100644 --- a/docs/reference/online-stores/remote.md +++ b/docs/reference/online-stores/remote.md @@ -11,11 +11,17 @@ The registry is pointing to registry of remote feature store. If it is not acces {% code title="feature_store.yaml" %} ```yaml project: my-local-project - registry: /remote/data/registry.db - provider: local - online_store: - path: http://localhost:6566 - type: remote - entity_key_serialization_version: 2 +registry: /remote/data/registry.db +provider: local +online_store: + path: http://localhost:6566 + type: remote +entity_key_serialization_version: 2 +auth: + type: no_auth ``` -{% endcode %} \ No newline at end of file +{% endcode %} + +## How to configure Authentication and Authorization +Please refer the [page](./../../../docs/getting-started/concepts/permission.md) for more details on how to configure authentication and authorization. 
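As a usage sketch (assuming the quickstart `driver_hourly_stats` feature view), client code does not change once the remote online store and an `auth` section are configured: the call below is forwarded to the online feature server, with the authorization token attached transparently by the client proxy.

```py
from feast import FeatureStore

# feature_store.yaml points the online store at the remote feature server and
# carries the `auth` section; the token is injected per request by the client.
store = FeatureStore(repo_path=".")

online_features = store.get_online_features(
    features=["driver_hourly_stats:conv_rate"],  # quickstart feature, illustrative only
    entity_rows=[{"driver_id": 1001}],
).to_dict()
print(online_features)
```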
+ diff --git a/docs/reference/registry/registry-permissions.md b/docs/reference/registry/registry-permissions.md new file mode 100644 index 0000000000..65508ef5b2 --- /dev/null +++ b/docs/reference/registry/registry-permissions.md @@ -0,0 +1,45 @@ +# Registry Permissions and Access Control + + +## API Endpoints and Permissions + +| Endpoint | Resource Type | Permission | Description | +| ------------------------ |---------------------|------------------------| -------------------------------------------------------------- | +| ApplyEntity | Entity | Create, Update, Delete | Apply an entity to the registry | +| GetEntity | Entity | Read | Get an entity from the registry | +| ListEntities | Entity | Read | List entities in the registry | +| DeleteEntity | Entity | Delete | Delete an entity from the registry | +| ApplyDataSource | DataSource | Create, Update, Delete | Apply a data source to the registry | +| GetDataSource | DataSource | Read | Get a data source from the registry | +| ListDataSources | DataSource | Read | List data sources in the registry | +| DeleteDataSource | DataSource | Delete | Delete a data source from the registry | +| ApplyFeatureView | FeatureView | Create, Update, Delete | Apply a feature view to the registry | +| GetFeatureView | FeatureView | Read | Get a feature view from the registry | +| ListFeatureViews | FeatureView | Read | List feature views in the registry | +| DeleteFeatureView | FeatureView | Delete | Delete a feature view from the registry | +| GetStreamFeatureView | StreamFeatureView | Read | Get a stream feature view from the registry | +| ListStreamFeatureViews | StreamFeatureView | Read | List stream feature views in the registry | +| GetOnDemandFeatureView | OnDemandFeatureView | Read | Get an on-demand feature view from the registry | +| ListOnDemandFeatureViews | OnDemandFeatureView | Read | List on-demand feature views in the registry | +| ApplyFeatureService | FeatureService | Create, Update, Delete | Apply a feature service to the registry | +| GetFeatureService | FeatureService | Read | Get a feature service from the registry | +| ListFeatureServices | FeatureService | Read | List feature services in the registry | +| DeleteFeatureService | FeatureService | Delete | Delete a feature service from the registry | +| ApplySavedDataset | SavedDataset | Create, Update, Delete | Apply a saved dataset to the registry | +| GetSavedDataset | SavedDataset | Read | Get a saved dataset from the registry | +| ListSavedDatasets | SavedDataset | Read | List saved datasets in the registry | +| DeleteSavedDataset | SavedDataset | Delete | Delete a saved dataset from the registry | +| ApplyValidationReference | ValidationReference | Create, Update, Delete | Apply a validation reference to the registry | +| GetValidationReference | ValidationReference | Read | Get a validation reference from the registry | +| ListValidationReferences | ValidationReference | Read | List validation references in the registry | +| DeleteValidationReference| ValidationReference | Delete | Delete a validation reference from the registry | +| ApplyPermission | Permission | Create, Update, Delete | Apply a permission to the registry | +| GetPermission | Permission | Read | Get a permission from the registry | +| ListPermissions | Permission | Read | List permissions in the registry | +| DeletePermission | Permission | Delete | Delete a permission from the registry | +| Commit | | None | Commit changes to the registry | +| Refresh | | None | Refresh the registry | +| Proto | | None | 
Get the proto representation of the registry | + +## How to configure Authentication and Authorization +Please refer the [page](./../../../docs/getting-started/concepts/permission.md) for more details on how to configure authentication and authorization. diff --git a/protos/feast/core/Permission.proto b/protos/feast/core/Permission.proto new file mode 100644 index 0000000000..57958d3d81 --- /dev/null +++ b/protos/feast/core/Permission.proto @@ -0,0 +1,69 @@ +syntax = "proto3"; +package feast.core; + +option go_package = "github.com/feast-dev/feast/go/protos/feast/core"; +option java_outer_classname = "PermissionProto"; +option java_package = "feast.proto.core"; + +import "feast/core/Policy.proto"; +import "google/protobuf/timestamp.proto"; + +message Permission { + // User-specified specifications of this permission. + PermissionSpec spec = 1; + + // System-populated metadata for this permission. + PermissionMeta meta = 2; +} + +message PermissionSpec { + enum AuthzedAction { + CREATE = 0; + DESCRIBE = 1; + UPDATE = 2; + DELETE = 3; + READ_ONLINE = 4; + READ_OFFLINE = 5; + WRITE_ONLINE = 6; + WRITE_OFFLINE = 7; + } + + // Name of the permission. Must be unique. Not updated. + string name = 1; + + // Name of Feast project. + string project = 2; + + enum Type { + FEATURE_VIEW = 0; + ON_DEMAND_FEATURE_VIEW = 1; + BATCH_FEATURE_VIEW = 2; + STREAM_FEATURE_VIEW= 3; + ENTITY = 4; + FEATURE_SERVICE = 5; + DATA_SOURCE = 6; + VALIDATION_REFERENCE = 7; + SAVED_DATASET = 8; + PERMISSION = 9; + } + + repeated Type types = 3; + + string name_pattern = 4; + + map required_tags = 5; + + // List of actions. + repeated AuthzedAction actions = 6; + + // the policy. + Policy policy = 7; + + // User defined metadata + map tags = 8; +} + +message PermissionMeta { + google.protobuf.Timestamp created_timestamp = 1; + google.protobuf.Timestamp last_updated_timestamp = 2; +} diff --git a/protos/feast/core/Policy.proto b/protos/feast/core/Policy.proto new file mode 100644 index 0000000000..7ad42b9797 --- /dev/null +++ b/protos/feast/core/Policy.proto @@ -0,0 +1,23 @@ +syntax = "proto3"; +package feast.core; + +option go_package = "github.com/feast-dev/feast/go/protos/feast/core"; +option java_outer_classname = "PolicyProto"; +option java_package = "feast.proto.core"; + +message Policy { + // Name of the policy. + string name = 1; + + // Name of Feast project. + string project = 2; + + oneof policy_type { + RoleBasedPolicy role_based_policy = 3; + } +} + +message RoleBasedPolicy { + // List of roles in this policy. 
+ repeated string roles = 1; +} diff --git a/protos/feast/core/Registry.proto b/protos/feast/core/Registry.proto index 0c3f8a53f9..b4f1ffb0a3 100644 --- a/protos/feast/core/Registry.proto +++ b/protos/feast/core/Registry.proto @@ -32,8 +32,9 @@ import "feast/core/DataSource.proto"; import "feast/core/SavedDataset.proto"; import "feast/core/ValidationProfile.proto"; import "google/protobuf/timestamp.proto"; +import "feast/core/Permission.proto"; -// Next id: 16 +// Next id: 17 message Registry { repeated Entity entities = 1; repeated FeatureTable feature_tables = 2; @@ -51,6 +52,7 @@ message Registry { string registry_schema_version = 3; // to support migrations; incremented when schema is changed string version_id = 4; // version id, random string generated on each update of the data; now used only for debugging purposes google.protobuf.Timestamp last_updated = 5; + repeated Permission permissions = 16; } message ProjectMetadata { diff --git a/protos/feast/registry/RegistryServer.proto b/protos/feast/registry/RegistryServer.proto index 44529f5409..928354077b 100644 --- a/protos/feast/registry/RegistryServer.proto +++ b/protos/feast/registry/RegistryServer.proto @@ -14,6 +14,7 @@ import "feast/core/FeatureService.proto"; import "feast/core/SavedDataset.proto"; import "feast/core/ValidationProfile.proto"; import "feast/core/InfraObject.proto"; +import "feast/core/Permission.proto"; service RegistryServer{ // Entity RPCs @@ -59,7 +60,13 @@ service RegistryServer{ rpc GetValidationReference (GetValidationReferenceRequest) returns (feast.core.ValidationReference) {} rpc ListValidationReferences (ListValidationReferencesRequest) returns (ListValidationReferencesResponse) {} rpc DeleteValidationReference (DeleteValidationReferenceRequest) returns (google.protobuf.Empty) {} - + + // Permission RPCs + rpc ApplyPermission (ApplyPermissionRequest) returns (google.protobuf.Empty) {} + rpc GetPermission (GetPermissionRequest) returns (feast.core.Permission) {} + rpc ListPermissions (ListPermissionsRequest) returns (ListPermissionsResponse) {} + rpc DeletePermission (DeletePermissionRequest) returns (google.protobuf.Empty) {} + rpc ApplyMaterialization (ApplyMaterializationRequest) returns (google.protobuf.Empty) {} rpc ListProjectMetadata (ListProjectMetadataRequest) returns (ListProjectMetadataResponse) {} rpc UpdateInfra (UpdateInfraRequest) returns (google.protobuf.Empty) {} @@ -277,6 +284,7 @@ message GetSavedDatasetRequest { message ListSavedDatasetsRequest { string project = 1; bool allow_cache = 2; + map tags = 3; } message ListSavedDatasetsResponse { @@ -306,6 +314,7 @@ message GetValidationReferenceRequest { message ListValidationReferencesRequest { string project = 1; bool allow_cache = 2; + map tags = 3; } message ListValidationReferencesResponse { @@ -316,4 +325,34 @@ message DeleteValidationReferenceRequest { string name = 1; string project = 2; bool commit = 3; -} \ No newline at end of file +} + +// Permissions + +message ApplyPermissionRequest { + feast.core.Permission permission = 1; + string project = 2; + bool commit = 3; +} + +message GetPermissionRequest { + string name = 1; + string project = 2; + bool allow_cache = 3; +} + +message ListPermissionsRequest { + string project = 1; + bool allow_cache = 2; + map tags = 3; +} + +message ListPermissionsResponse { + repeated feast.core.Permission permissions = 1; +} + +message DeletePermissionRequest { + string name = 1; + string project = 2; + bool commit = 3; +} diff --git a/sdk/python/docs/source/feast.rst 
b/sdk/python/docs/source/feast.rst index 95fbea8d7a..83137574dd 100644 --- a/sdk/python/docs/source/feast.rst +++ b/sdk/python/docs/source/feast.rst @@ -12,6 +12,7 @@ Subpackages feast.embedded_go feast.infra feast.loaders + feast.permissions feast.protos feast.transformation feast.ui @@ -251,6 +252,14 @@ feast.proto\_json module :undoc-members: :show-inheritance: +feast.prova module +------------------ + +.. automodule:: feast.prova + :members: + :undoc-members: + :show-inheritance: + feast.registry\_server module ----------------------------- diff --git a/sdk/python/feast/cli.py b/sdk/python/feast/cli.py index f4e3e97d27..737704dd36 100644 --- a/sdk/python/feast/cli.py +++ b/sdk/python/feast/cli.py @@ -16,23 +16,27 @@ from datetime import datetime from importlib.metadata import version as importlib_version from pathlib import Path -from typing import List, Optional +from typing import Any, List, Optional import click import yaml +from bigtree import Node from colorama import Fore, Style from dateutil import parser from pygments import formatters, highlight, lexers -from feast import utils +import feast.cli_utils as cli_utils +from feast import BatchFeatureView, Entity, FeatureService, StreamFeatureView, utils from feast.constants import ( DEFAULT_FEATURE_TRANSFORMATION_SERVER_PORT, DEFAULT_OFFLINE_SERVER_PORT, DEFAULT_REGISTRY_SERVER_PORT, ) +from feast.data_source import DataSource from feast.errors import FeastObjectNotFoundException, FeastProviderLoginError from feast.feature_view import FeatureView from feast.on_demand_feature_view import OnDemandFeatureView +from feast.permissions.policy import RoleBasedPolicy from feast.repo_config import load_repo_config from feast.repo_operations import ( apply_total, @@ -44,6 +48,7 @@ registry_dump, teardown, ) +from feast.saved_dataset import SavedDataset, ValidationReference from feast.utils import maybe_local_tz _logger = logging.getLogger(__name__) @@ -879,5 +884,253 @@ def validate( exit(1) +@cli.group(name="permissions") +def feast_permissions_cmd(): + """ + Access permissions + """ + pass + + +@feast_permissions_cmd.command(name="list") +@click.option( + "--verbose", + "-v", + is_flag=True, + help="Print the resources matching each configured permission", +) +@tagsOption +@click.pass_context +def feast_permissions_list_command(ctx: click.Context, verbose: bool, tags: list[str]): + from tabulate import tabulate + + table: list[Any] = [] + tags_filter = utils.tags_list_to_dict(tags) + + store = create_feature_store(ctx) + + permissions = store.list_permissions(tags=tags_filter) + + root_node = Node("permissions") + roles: set[str] = set() + + for p in permissions: + policy = p.policy + if not verbose: + cli_utils.handle_not_verbose_permissions_command(p, policy, table) + else: + if isinstance(policy, RoleBasedPolicy) and len(policy.get_roles()) > 0: + roles = set(policy.get_roles()) + permission_node = Node( + p.name + " " + str(list(roles)), parent=root_node + ) + else: + permission_node = Node(p.name, parent=root_node) + + for feast_type in p.types: + if feast_type in [ + FeatureView, + OnDemandFeatureView, + BatchFeatureView, + StreamFeatureView, + ]: + cli_utils.handle_fv_verbose_permissions_command( + feast_type, # type: ignore[arg-type] + p, + permission_node, + store, + tags_filter, + ) + elif feast_type == Entity: + cli_utils.handle_entity_verbose_permissions_command( + feast_type, # type: ignore[arg-type] + p, + permission_node, + store, + tags_filter, + ) + elif feast_type == FeatureService: + 
cli_utils.handle_fs_verbose_permissions_command( + feast_type, # type: ignore[arg-type] + p, + permission_node, + store, + tags_filter, + ) + elif feast_type == DataSource: + cli_utils.handle_ds_verbose_permissions_command( + feast_type, # type: ignore[arg-type] + p, + permission_node, + store, + tags_filter, + ) + elif feast_type == ValidationReference: + cli_utils.handle_vr_verbose_permissions_command( + feast_type, # type: ignore[arg-type] + p, + permission_node, + store, + tags_filter, + ) + elif feast_type == SavedDataset: + cli_utils.handle_sd_verbose_permissions_command( + feast_type, # type: ignore[arg-type] + p, + permission_node, + store, + tags_filter, + ) + + if not verbose: + print( + tabulate( + table, + headers=[ + "NAME", + "TYPES", + "NAME_PATTERN", + "ACTIONS", + "ROLES", + "REQUIRED_TAGS", + ], + tablefmt="plain", + ) + ) + else: + cli_utils.print_permission_verbose_example() + + print("Permissions:") + print("") + root_node.show() + + +@feast_permissions_cmd.command("describe") +@click.argument("name", type=click.STRING) +@click.pass_context +def permission_describe(ctx: click.Context, name: str): + """ + Describe a permission + """ + store = create_feature_store(ctx) + + try: + permission = store.get_permission(name) + except FeastObjectNotFoundException as e: + print(e) + exit(1) + + print( + yaml.dump( + yaml.safe_load(str(permission)), default_flow_style=False, sort_keys=False + ) + ) + + +@feast_permissions_cmd.command(name="check") +@click.pass_context +def feast_permissions_check_command(ctx: click.Context): + """ + Validate the permissions configuration + """ + from tabulate import tabulate + + all_unsecured_table: list[Any] = [] + store = create_feature_store(ctx) + permissions = store.list_permissions() + objects = cli_utils.fetch_all_feast_objects( + store=store, + ) + + print( + f"{Style.BRIGHT + Fore.RED}The following resources are not secured by any permission configuration:{Style.RESET_ALL}" + ) + for o in objects: + cli_utils.handle_permissions_check_command( + object=o, permissions=permissions, table=all_unsecured_table + ) + print( + tabulate( + all_unsecured_table, + headers=[ + "NAME", + "TYPE", + ], + tablefmt="plain", + ) + ) + + all_unsecured_actions_table: list[Any] = [] + print( + f"{Style.BRIGHT + Fore.RED}The following actions are not secured by any permission configuration (Note: this might not be a security concern, depending on the used APIs):{Style.RESET_ALL}" + ) + for o in objects: + cli_utils.handle_permissions_check_command_with_actions( + object=o, permissions=permissions, table=all_unsecured_actions_table + ) + print( + tabulate( + all_unsecured_actions_table, + headers=[ + "NAME", + "TYPE", + "UNSECURED ACTIONS", + ], + tablefmt="plain", + ) + ) + + +@feast_permissions_cmd.command(name="list-roles") +@click.option( + "--verbose", + "-v", + is_flag=True, + help="Print the resources and actions permitted to each configured role", +) +@click.pass_context +def feast_permissions_list_roles_command(ctx: click.Context, verbose: bool): + """ + List all the configured roles + """ + from tabulate import tabulate + + table: list[Any] = [] + store = create_feature_store(ctx) + permissions = store.list_permissions() + if not verbose: + cli_utils.handler_list_all_permissions_roles( + permissions=permissions, table=table + ) + print( + tabulate( + table, + headers=[ + "ROLE NAME", + ], + tablefmt="grid", + ) + ) + else: + objects = cli_utils.fetch_all_feast_objects( + store=store, + ) + cli_utils.handler_list_all_permissions_roles_verbose( + 
objects=objects, permissions=permissions, table=table + ) + print( + tabulate( + table, + headers=[ + "ROLE NAME", + "RESOURCE NAME", + "RESOURCE TYPE", + "PERMITTED ACTIONS", + ], + tablefmt="plain", + ) + ) + + if __name__ == "__main__": cli() diff --git a/sdk/python/feast/cli_utils.py b/sdk/python/feast/cli_utils.py new file mode 100644 index 0000000000..edfdab93e3 --- /dev/null +++ b/sdk/python/feast/cli_utils.py @@ -0,0 +1,329 @@ +from typing import Any, Optional + +from bigtree import Node +from colorama import Fore, Style + +from feast import ( + BatchFeatureView, + FeatureService, + FeatureStore, + FeatureView, + OnDemandFeatureView, + StreamFeatureView, +) +from feast.feast_object import FeastObject +from feast.permissions.action import ALL_ACTIONS +from feast.permissions.decision import DecisionEvaluator +from feast.permissions.permission import Permission +from feast.permissions.policy import Policy, RoleBasedPolicy +from feast.permissions.user import User + + +def print_permission_verbose_example(): + print("") + print( + f"{Style.BRIGHT + Fore.GREEN}The structure of the {Style.BRIGHT + Fore.WHITE}feast-permissions list --verbose {Style.BRIGHT + Fore.GREEN}command will be as in the following example:" + ) + print("") + print(f"{Style.DIM}For example: {Style.RESET_ALL}{Style.BRIGHT + Fore.GREEN}") + print("") + explanation_root_node = Node("permissions") + explanation_permission_node = Node( + "permission_1" + " " + str(["role names list"]), + parent=explanation_root_node, + ) + Node( + FeatureView.__name__ + ": " + str(["feature view names"]), + parent=explanation_permission_node, + ) + Node(FeatureService.__name__ + ": none", parent=explanation_permission_node) + Node("..", parent=explanation_permission_node) + Node( + "permission_2" + " " + str(["role names list"]), + parent=explanation_root_node, + ) + Node("..", parent=explanation_root_node) + explanation_root_node.show() + print( + f""" +-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------{Style.RESET_ALL} + """ + ) + + +def handle_sd_verbose_permissions_command( + feast_type: list[FeastObject], + p: Permission, + policy_node: Node, + store: FeatureStore, + tags_filter: Optional[dict[str, str]], +): + saved_datasets = store.list_saved_datasets(tags=tags_filter) + saved_datasets_names = set() + for sd in saved_datasets: + if p.match_resource(sd): + saved_datasets_names.add(sd.name) + if len(saved_datasets_names) > 0: + Node( + feast_type.__name__ + ": " + str(list(saved_datasets_names)), # type: ignore[union-attr, attr-defined] + parent=policy_node, + ) + else: + Node(feast_type.__name__ + ": none", parent=policy_node) # type: ignore[union-attr, attr-defined] + + +def handle_vr_verbose_permissions_command( + feast_type: list[FeastObject], + p: Permission, + policy_node: Node, + store: FeatureStore, + tags_filter: Optional[dict[str, str]], +): + validation_references = store.list_validation_references(tags=tags_filter) + validation_references_names = set() + for vr in validation_references: + if p.match_resource(vr): + validation_references_names.add(vr.name) + if len(validation_references_names) > 0: + Node( + feast_type.__name__ + ": " + str(list(validation_references_names)), # type: ignore[union-attr, attr-defined] + parent=policy_node, + ) + else: + Node(feast_type.__name__ + ": none", parent=policy_node) # type: ignore[union-attr, attr-defined] + + +def 
handle_ds_verbose_permissions_command( + feast_type: list[FeastObject], + p: Permission, + policy_node: Node, + store: FeatureStore, + tags_filter: Optional[dict[str, str]], +): + data_sources = store.list_data_sources(tags=tags_filter) + data_sources_names = set() + for ds in data_sources: + if p.match_resource(ds): + data_sources_names.add(ds.name) + if len(data_sources_names) > 0: + Node( + feast_type.__name__ + ": " + str(list(data_sources_names)), # type: ignore[union-attr, attr-defined] + parent=policy_node, + ) + else: + Node(feast_type.__name__ + ": none", parent=policy_node) # type: ignore[union-attr, attr-defined] + + +def handle_fs_verbose_permissions_command( + feast_type: list[FeastObject], + p: Permission, + policy_node: Node, + store: FeatureStore, + tags_filter: Optional[dict[str, str]], +): + feature_services = store.list_feature_services(tags=tags_filter) + feature_services_names = set() + for fs in feature_services: + if p.match_resource(fs): + feature_services_names.add(fs.name) + if len(feature_services_names) > 0: + Node( + feast_type.__name__ + ": " + str(list(feature_services_names)), # type: ignore[union-attr, attr-defined] + parent=policy_node, + ) + else: + Node(feast_type.__name__ + ": none", parent=policy_node) # type: ignore[union-attr, attr-defined] + + +def handle_entity_verbose_permissions_command( + feast_type: list[FeastObject], + p: Permission, + policy_node: Node, + store: FeatureStore, + tags_filter: Optional[dict[str, str]], +): + entities = store.list_entities(tags=tags_filter) + entities_names = set() + for e in entities: + if p.match_resource(e): + entities_names.add(e.name) + if len(entities_names) > 0: + Node(feast_type.__name__ + ": " + str(list(entities_names)), parent=policy_node) # type: ignore[union-attr, attr-defined] + else: + Node(feast_type.__name__ + ": none", parent=policy_node) # type: ignore[union-attr, attr-defined] + + +def handle_fv_verbose_permissions_command( + feast_type: list[FeastObject], + p: Permission, + policy_node: Node, + store: FeatureStore, + tags_filter: Optional[dict[str, str]], +): + feature_views = [] + feature_views_names = set() + if feast_type == FeatureView: + feature_views = store.list_all_feature_views(tags=tags_filter) # type: ignore[assignment] + elif feast_type == OnDemandFeatureView: + feature_views = store.list_on_demand_feature_views( + tags=tags_filter # type: ignore[assignment] + ) + elif feast_type == BatchFeatureView: + feature_views = store.list_batch_feature_views(tags=tags_filter) # type: ignore[assignment] + elif feast_type == StreamFeatureView: + feature_views = store.list_stream_feature_views( + tags=tags_filter # type: ignore[assignment] + ) + for fv in feature_views: + if p.match_resource(fv): + feature_views_names.add(fv.name) + if len(feature_views_names) > 0: + Node( + feast_type.__name__ + " " + str(list(feature_views_names)), # type: ignore[union-attr, attr-defined] + parent=policy_node, + ) + else: + Node(feast_type.__name__ + ": none", parent=policy_node) # type: ignore[union-attr, attr-defined] + + +def handle_not_verbose_permissions_command( + p: Permission, policy: Policy, table: list[Any] +): + roles: set[str] = set() + if isinstance(policy, RoleBasedPolicy): + roles = set(policy.get_roles()) + table.append( + [ + p.name, + _to_multi_line([t.__name__ for t in p.types]), # type: ignore[union-attr, attr-defined] + p.name_pattern, + _to_multi_line([a.value.upper() for a in p.actions]), + _to_multi_line(sorted(roles)), + _dict_to_multi_line(p.required_tags), + ], + ) + + +def 
fetch_all_feast_objects(store: FeatureStore) -> list[FeastObject]: + objects: list[FeastObject] = [] + objects.extend(store.list_entities()) + objects.extend(store.list_all_feature_views()) + objects.extend(store.list_batch_feature_views()) + objects.extend(store.list_feature_services()) + objects.extend(store.list_data_sources()) + objects.extend(store.list_validation_references()) + objects.extend(store.list_saved_datasets()) + objects.extend(store.list_permissions()) + return objects + + +def handle_permissions_check_command( + object: FeastObject, permissions: list[Permission], table: list[Any] +): + for p in permissions: + if p.match_resource(object): + return + table.append( + [ + object.name, + type(object).__name__, + ] + ) + + +def handle_permissions_check_command_with_actions( + object: FeastObject, permissions: list[Permission], table: list[Any] +): + unmatched_actions = ALL_ACTIONS.copy() + for p in permissions: + if p.match_resource(object): + for action in ALL_ACTIONS: + if p.match_actions([action]) and action in unmatched_actions: + unmatched_actions.remove(action) + + if unmatched_actions: + table.append( + [ + object.name, + type(object).__name__, + _to_multi_line([a.value.upper() for a in unmatched_actions]), + ] + ) + + +def fetch_all_permission_roles(permissions: list[Permission]) -> list[str]: + all_roles = set() + for p in permissions: + if isinstance(p.policy, RoleBasedPolicy) and len(p.policy.get_roles()) > 0: + all_roles.update(p.policy.get_roles()) + + return sorted(all_roles) + + +def handler_list_all_permissions_roles(permissions: list[Permission], table: list[Any]): + all_roles = fetch_all_permission_roles(permissions) + for role in all_roles: + table.append( + [ + role, + ] + ) + + +def handler_list_all_permissions_roles_verbose( + objects: list[FeastObject], permissions: list[Permission], table: list[Any] +): + all_roles = fetch_all_permission_roles(permissions) + + for role in all_roles: + for o in objects: + permitted_actions = ALL_ACTIONS.copy() + for action in ALL_ACTIONS: + # Following code is derived from enforcer.enforce_policy but has a different return type and does not raise PermissionError + matching_permissions = [ + p + for p in permissions + if p.match_resource(o) and p.match_actions([action]) + ] + + if matching_permissions: + evaluator = DecisionEvaluator( + len(matching_permissions), + ) + for p in matching_permissions: + permission_grant, permission_explanation = ( + p.policy.validate_user(user=User(username="", roles=[role])) + ) + evaluator.add_grant( + permission_grant, + f"Permission {p.name} denied access: {permission_explanation}", + ) + + if evaluator.is_decided(): + grant, explanations = evaluator.grant() + if not grant: + permitted_actions.remove(action) + break + else: + permitted_actions.remove(action) + + table.append( + [ + role, + o.name, + type(o).__name__, + _to_multi_line([a.value.upper() for a in permitted_actions]), + ] + ) + + +def _to_multi_line(values: list[str]) -> str: + if not values: + return "-" + return "\n".join(values) + + +def _dict_to_multi_line(values: dict[str, str]) -> str: + if not values: + return "-" + return "\n".join([f"{key} : {value}" for key, value in values.items()]) diff --git a/sdk/python/feast/diff/registry_diff.py b/sdk/python/feast/diff/registry_diff.py index 9236b087d4..6235025adc 100644 --- a/sdk/python/feast/diff/registry_diff.py +++ b/sdk/python/feast/diff/registry_diff.py @@ -10,6 +10,7 @@ from feast.feature_view import DUMMY_ENTITY_NAME from feast.infra.registry.base_registry import 
BaseRegistry from feast.infra.registry.registry import FEAST_OBJECT_TYPES, FeastObjectType +from feast.permissions.permission import Permission from feast.protos.feast.core.DataSource_pb2 import DataSource as DataSourceProto from feast.protos.feast.core.Entity_pb2 import Entity as EntityProto from feast.protos.feast.core.FeatureService_pb2 import ( @@ -20,6 +21,7 @@ OnDemandFeatureView as OnDemandFeatureViewProto, ) from feast.protos.feast.core.OnDemandFeatureView_pb2 import OnDemandFeatureViewSpec +from feast.protos.feast.core.Permission_pb2 import Permission as PermissionProto from feast.protos.feast.core.SavedDataset_pb2 import SavedDataset as SavedDatasetProto from feast.protos.feast.core.StreamFeatureView_pb2 import ( StreamFeatureView as StreamFeatureViewProto, @@ -111,6 +113,7 @@ def tag_objects_for_keep_delete_update_add( StreamFeatureViewProto, ValidationReferenceProto, SavedDatasetProto, + PermissionProto, ) @@ -354,6 +357,15 @@ def apply_diff_to_registry( project, commit=False, ) + elif feast_object_diff.feast_object_type == FeastObjectType.PERMISSION: + permission_obj = cast( + Permission, feast_object_diff.current_feast_object + ) + registry.delete_permission( + permission_obj.name, + project, + commit=False, + ) if feast_object_diff.transition_type in [ TransitionType.CREATE, @@ -387,6 +399,12 @@ def apply_diff_to_registry( project, commit=False, ) + elif feast_object_diff.feast_object_type == FeastObjectType.PERMISSION: + registry.apply_permission( + cast(Permission, feast_object_diff.new_feast_object), + project, + commit=False, + ) if commit: registry.commit() diff --git a/sdk/python/feast/errors.py b/sdk/python/feast/errors.py index c4c1157626..ffafe31125 100644 --- a/sdk/python/feast/errors.py +++ b/sdk/python/feast/errors.py @@ -223,6 +223,13 @@ def __init__(self, online_store_class_name: str): ) +class FeastInvalidAuthConfigClass(Exception): + def __init__(self, auth_config_class_name: str): + super().__init__( + f"Auth Config Class '{auth_config_class_name}' should end with the string `AuthConfig`.'" + ) + + class FeastInvalidBaseClass(Exception): def __init__(self, class_name: str, class_type: str): super().__init__( @@ -391,6 +398,19 @@ def __init__(self, input_dict: dict): ) +class PermissionNotFoundException(Exception): + def __init__(self, name, project): + super().__init__(f"Permission {name} does not exist in project {project}") + + +class PermissionObjectNotFoundException(FeastObjectNotFoundException): + def __init__(self, name, project=None): + if project: + super().__init__(f"Permission {name} does not exist in project {project}") + else: + super().__init__(f"Permission {name} does not exist") + + class ZeroRowsQueryResult(Exception): def __init__(self, query: str): super().__init__(f"This query returned zero rows:\n{query}") diff --git a/sdk/python/feast/feast_object.py b/sdk/python/feast/feast_object.py index d9505dcb9f..dfe29b7128 100644 --- a/sdk/python/feast/feast_object.py +++ b/sdk/python/feast/feast_object.py @@ -1,4 +1,4 @@ -from typing import Union +from typing import Union, get_args from .batch_feature_view import BatchFeatureView from .data_source import DataSource @@ -6,11 +6,13 @@ from .feature_service import FeatureService from .feature_view import FeatureView from .on_demand_feature_view import OnDemandFeatureView +from .permissions.permission import Permission from .protos.feast.core.DataSource_pb2 import DataSource as DataSourceProto from .protos.feast.core.Entity_pb2 import EntitySpecV2 from .protos.feast.core.FeatureService_pb2 import 
FeatureServiceSpec from .protos.feast.core.FeatureView_pb2 import FeatureViewSpec from .protos.feast.core.OnDemandFeatureView_pb2 import OnDemandFeatureViewSpec +from .protos.feast.core.Permission_pb2 import PermissionSpec as PermissionSpec from .protos.feast.core.SavedDataset_pb2 import SavedDatasetSpec from .protos.feast.core.StreamFeatureView_pb2 import StreamFeatureViewSpec from .protos.feast.core.ValidationProfile_pb2 import ( @@ -30,6 +32,7 @@ DataSource, ValidationReference, SavedDataset, + Permission, ] FeastObjectSpecProto = Union[ @@ -41,4 +44,13 @@ DataSourceProto, ValidationReferenceProto, SavedDatasetSpec, + PermissionSpec, +] + +ALL_RESOURCE_TYPES = list(get_args(FeastObject)) +ALL_FEATURE_VIEW_TYPES = [ + FeatureView, + OnDemandFeatureView, + BatchFeatureView, + StreamFeatureView, ] diff --git a/sdk/python/feast/feature_server.py b/sdk/python/feast/feature_server.py index 908c9741c2..7f24580b7a 100644 --- a/sdk/python/feast/feature_server.py +++ b/sdk/python/feast/feature_server.py @@ -9,9 +9,8 @@ import pandas as pd import psutil from dateutil import parser -from fastapi import FastAPI, HTTPException, Request, Response, status +from fastapi import Depends, FastAPI, HTTPException, Request, Response, status from fastapi.logger import logger -from fastapi.params import Depends from google.protobuf.json_format import MessageToDict from prometheus_client import Gauge, start_http_server from pydantic import BaseModel @@ -20,7 +19,16 @@ from feast import proto_json, utils from feast.constants import DEFAULT_FEATURE_SERVER_REGISTRY_TTL from feast.data_source import PushMode -from feast.errors import PushSourceNotFoundException +from feast.errors import FeatureViewNotFoundException, PushSourceNotFoundException +from feast.permissions.action import WRITE, AuthzedAction +from feast.permissions.security_manager import assert_permissions +from feast.permissions.server.rest import inject_user_details +from feast.permissions.server.utils import ( + ServerType, + init_auth_manager, + init_security_manager, + str_to_auth_manager_type, +) # Define prometheus metrics cpu_usage_gauge = Gauge( @@ -93,23 +101,48 @@ async def lifespan(app: FastAPI): async def get_body(request: Request): return await request.body() - @app.post("/get-online-features") + # TODO RBAC: complete the dependencies for the other endpoints + @app.post( + "/get-online-features", + dependencies=[Depends(inject_user_details)], + ) def get_online_features(body=Depends(get_body)): try: body = json.loads(body) + full_feature_names = body.get("full_feature_names", False) + entity_rows = body["entities"] # Initialize parameters for FeatureStore.get_online_features(...) 
call if "feature_service" in body: - features = store.get_feature_service( + feature_service = store.get_feature_service( body["feature_service"], allow_cache=True ) + assert_permissions( + resource=feature_service, actions=[AuthzedAction.READ_ONLINE] + ) + features = feature_service else: features = body["features"] - - full_feature_names = body.get("full_feature_names", False) + all_feature_views, all_on_demand_feature_views = ( + utils._get_feature_views_to_use( + store.registry, + store.project, + features, + allow_cache=True, + hide_dummy_entity=False, + ) + ) + for feature_view in all_feature_views: + assert_permissions( + resource=feature_view, actions=[AuthzedAction.READ_ONLINE] + ) + for od_feature_view in all_on_demand_feature_views: + assert_permissions( + resource=od_feature_view, actions=[AuthzedAction.READ_ONLINE] + ) response_proto = store.get_online_features( features=features, - entity_rows=body["entities"], + entity_rows=entity_rows, full_feature_names=full_feature_names, ).proto @@ -123,21 +156,46 @@ def get_online_features(body=Depends(get_body)): # Raise HTTPException to return the error message to the client raise HTTPException(status_code=500, detail=str(e)) - @app.post("/push") + @app.post("/push", dependencies=[Depends(inject_user_details)]) def push(body=Depends(get_body)): try: request = PushFeaturesRequest(**json.loads(body)) df = pd.DataFrame(request.df) + actions = [] if request.to == "offline": to = PushMode.OFFLINE + actions = [AuthzedAction.WRITE_OFFLINE] elif request.to == "online": to = PushMode.ONLINE + actions = [AuthzedAction.WRITE_ONLINE] elif request.to == "online_and_offline": to = PushMode.ONLINE_AND_OFFLINE + actions = WRITE else: raise ValueError( f"{request.to} is not a supported push format. Please specify one of these ['online', 'offline', 'online_and_offline']." 
) + + from feast.data_source import PushSource + + all_fvs = store.list_feature_views( + allow_cache=request.allow_registry_cache + ) + store.list_stream_feature_views( + allow_cache=request.allow_registry_cache + ) + fvs_with_push_sources = { + fv + for fv in all_fvs + if ( + fv.stream_source is not None + and isinstance(fv.stream_source, PushSource) + and fv.stream_source.name == request.push_source_name + ) + } + + for feature_view in fvs_with_push_sources: + assert_permissions(resource=feature_view, actions=actions) + store.push( push_source_name=request.push_source_name, df=df, @@ -155,15 +213,29 @@ def push(body=Depends(get_body)): # Raise HTTPException to return the error message to the client raise HTTPException(status_code=500, detail=str(e)) - @app.post("/write-to-online-store") + @app.post("/write-to-online-store", dependencies=[Depends(inject_user_details)]) def write_to_online_store(body=Depends(get_body)): try: request = WriteToFeatureStoreRequest(**json.loads(body)) df = pd.DataFrame(request.df) + feature_view_name = request.feature_view_name + allow_registry_cache = request.allow_registry_cache + try: + feature_view = store.get_stream_feature_view( + feature_view_name, allow_registry_cache=allow_registry_cache + ) + except FeatureViewNotFoundException: + feature_view = store.get_feature_view( + feature_view_name, allow_registry_cache=allow_registry_cache + ) + + assert_permissions( + resource=feature_view, actions=[AuthzedAction.WRITE_ONLINE] + ) store.write_to_online_store( - feature_view_name=request.feature_view_name, + feature_view_name=feature_view_name, df=df, - allow_registry_cache=request.allow_registry_cache, + allow_registry_cache=allow_registry_cache, ) except Exception as e: # Print the original exception on the server side @@ -175,10 +247,14 @@ def write_to_online_store(body=Depends(get_body)): def health(): return Response(status_code=status.HTTP_200_OK) - @app.post("/materialize") + @app.post("/materialize", dependencies=[Depends(inject_user_details)]) def materialize(body=Depends(get_body)): try: request = MaterializeRequest(**json.loads(body)) + for feature_view in request.feature_views: + assert_permissions( + resource=feature_view, actions=[AuthzedAction.WRITE_ONLINE] + ) store.materialize( utils.make_tzaware(parser.parse(request.start_ts)), utils.make_tzaware(parser.parse(request.end_ts)), @@ -190,10 +266,14 @@ def materialize(body=Depends(get_body)): # Raise HTTPException to return the error message to the client raise HTTPException(status_code=500, detail=str(e)) - @app.post("/materialize-incremental") + @app.post("/materialize-incremental", dependencies=[Depends(inject_user_details)]) def materialize_incremental(body=Depends(get_body)): try: request = MaterializeIncrementalRequest(**json.loads(body)) + for feature_view in request.feature_views: + assert_permissions( + resource=feature_view, actions=[AuthzedAction.WRITE_ONLINE] + ) store.materialize_incremental( utils.make_tzaware(parser.parse(request.end_ts)), request.feature_views ) @@ -231,15 +311,15 @@ def load(self): def monitor_resources(self, interval: int = 5): """Function to monitor and update CPU and memory usage metrics.""" - print(f"Start monitor_resources({interval})") + logger.debug(f"Starting resource monitoring with interval {interval} seconds") p = psutil.Process() - print(f"PID is {p.pid}") + logger.debug(f"PID is {p.pid}") while True: with p.oneshot(): cpu_usage = p.cpu_percent() memory_usage = p.memory_percent() - print(f"cpu_usage is {cpu_usage}") - print(f"memory_usage is 
{memory_usage}") + logger.debug(f"CPU usage: {cpu_usage}%, Memory usage: {memory_usage}%") + logger.debug(f"CPU usage: {cpu_usage}%, Memory usage: {memory_usage}%") cpu_usage_gauge.set(cpu_usage) memory_usage_gauge.set(memory_usage) time.sleep(interval) @@ -256,15 +336,27 @@ def start_server( metrics: bool, ): if metrics: - print("Start Prometheus Server") + logger.info("Starting Prometheus Server") start_http_server(8000) - print("Start a background thread to monitor CPU and memory usage") + logger.debug("Starting background thread to monitor CPU and memory usage") monitoring_thread = threading.Thread( target=monitor_resources, args=(5,), daemon=True ) monitoring_thread.start() + logger.debug("start_server called") + auth_type = str_to_auth_manager_type(store.config.auth_config.type) + logger.info(f"Auth type: {auth_type}") + init_security_manager(auth_type=auth_type, fs=store) + logger.debug("Security manager initialized successfully") + init_auth_manager( + auth_type=auth_type, + server_type=ServerType.REST, + auth_config=store.config.auth_config, + ) + logger.debug("Auth manager initialized successfully") + if sys.platform != "win32": FeastServeApplication( store=store, diff --git a/sdk/python/feast/feature_store.py b/sdk/python/feast/feature_store.py index 77638f5a62..a03706e56f 100644 --- a/sdk/python/feast/feature_store.py +++ b/sdk/python/feast/feature_store.py @@ -76,6 +76,7 @@ from feast.infra.registry.sql import SqlRegistry from feast.on_demand_feature_view import OnDemandFeatureView from feast.online_response import OnlineResponse +from feast.permissions.permission import Permission from feast.protos.feast.core.InfraObject_pb2 import Infra as InfraProto from feast.protos.feast.serving.ServingService_pb2 import ( FieldStatus, @@ -157,9 +158,16 @@ def __init__( elif registry_config and registry_config.registry_type == "remote": from feast.infra.registry.remote import RemoteRegistry - self._registry = RemoteRegistry(registry_config, self.config.project, None) + self._registry = RemoteRegistry( + registry_config, self.config.project, None, self.config.auth_config + ) else: - r = Registry(self.config.project, registry_config, repo_path=self.repo_path) + r = Registry( + self.config.project, + registry_config, + repo_path=self.repo_path, + auth_config=self.config.auth_config, + ) r._initialize_registry(self.config.project) self._registry = r @@ -199,7 +207,10 @@ def refresh_registry(self): """ registry_config = self.config.registry registry = Registry( - self.config.project, registry_config, repo_path=self.repo_path + self.config.project, + registry_config, + repo_path=self.repo_path, + auth_config=self.config.auth_config, ) registry.refresh(self.config.project) @@ -734,7 +745,8 @@ def plan( ... on_demand_feature_views=list(), ... stream_feature_views=list(), ... entities=[driver], - ... feature_services=list())) # register entity and feature view + ... feature_services=list(), + ... permissions=list())) # register entity and feature view """ # Validate and run inference on all the objects to be registered. 
self._validate_all_feature_views( @@ -798,6 +810,7 @@ def apply( StreamFeatureView, FeatureService, ValidationReference, + Permission, List[FeastObject], ], objects_to_delete: Optional[List[FeastObject]] = None, @@ -869,6 +882,7 @@ def apply( validation_references_to_update = [ ob for ob in objects if isinstance(ob, ValidationReference) ] + permissions_to_update = [ob for ob in objects if isinstance(ob, Permission)] batch_sources_to_add: List[DataSource] = [] for data_source in data_sources_set_to_update: @@ -924,10 +938,15 @@ def apply( self._registry.apply_validation_reference( validation_references, project=self.project, commit=False ) + for permission in permissions_to_update: + self._registry.apply_permission( + permission, project=self.project, commit=False + ) entities_to_delete = [] views_to_delete = [] sfvs_to_delete = [] + permissions_to_delete = [] if not partial: # Delete all registry objects that should not exist. entities_to_delete = [ @@ -956,6 +975,9 @@ def apply( validation_references_to_delete = [ ob for ob in objects_to_delete if isinstance(ob, ValidationReference) ] + permissions_to_delete = [ + ob for ob in objects_to_delete if isinstance(ob, Permission) + ] for data_source in data_sources_to_delete: self._registry.delete_data_source( @@ -985,6 +1007,10 @@ def apply( self._registry.delete_validation_reference( validation_references.name, project=self.project, commit=False ) + for permission in permissions_to_delete: + self._registry.delete_permission( + permission.name, project=self.project, commit=False + ) tables_to_delete: List[FeatureView] = ( views_to_delete + sfvs_to_delete if not partial else [] # type: ignore @@ -1915,6 +1941,72 @@ def get_validation_reference( ref._dataset = self.get_saved_dataset(ref.dataset_name) return ref + def list_validation_references( + self, allow_cache: bool = False, tags: Optional[dict[str, str]] = None + ) -> List[ValidationReference]: + """ + Retrieves the list of validation references from the registry. + + Args: + allow_cache: Whether to allow returning validation references from a cached registry. + tags: Filter by tags. + + Returns: + A list of validation references. + """ + return self._registry.list_validation_references( + self.project, allow_cache=allow_cache, tags=tags + ) + + def list_permissions( + self, allow_cache: bool = False, tags: Optional[dict[str, str]] = None + ) -> List[Permission]: + """ + Retrieves the list of permissions from the registry. + + Args: + allow_cache: Whether to allow returning permissions from a cached registry. + tags: Filter by tags. + + Returns: + A list of permissions. + """ + return self._registry.list_permissions( + self.project, allow_cache=allow_cache, tags=tags + ) + + def get_permission(self, name: str) -> Permission: + """ + Retrieves a permission from the registry. + + Args: + name: Name of the permission. + + Returns: + The specified permission. + + Raises: + PermissionObjectNotFoundException: The permission could not be found. + """ + return self._registry.get_permission(name, self.project) + + def list_saved_datasets( + self, allow_cache: bool = False, tags: Optional[dict[str, str]] = None + ) -> List[SavedDataset]: + """ + Retrieves the list of saved datasets from the registry. + + Args: + allow_cache: Whether to allow returning saved datasets from a cached registry. + tags: Filter by tags. + + Returns: + A list of saved datasets. 
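
Since `FeatureStore.apply` now accepts `Permission` objects and the store exposes `list_permissions`/`get_permission`, a typical registration flow could look roughly like the sketch below. `Permission`, `AuthzedAction`, and `ALL_RESOURCE_TYPES` appear in this patch; the `Permission` constructor arguments, the `RoleBasedPolicy` import path, and the permission/role names are assumptions for illustration only.

```python
from feast import FeatureStore
from feast.feast_object import ALL_RESOURCE_TYPES
from feast.permissions.action import AuthzedAction
from feast.permissions.permission import Permission
from feast.permissions.policy import RoleBasedPolicy  # assumed module path

store = FeatureStore(repo_path=".")

reader_permission = Permission(
    name="online-reader",                      # hypothetical name
    types=ALL_RESOURCE_TYPES,                  # guard every Feast resource type
    policy=RoleBasedPolicy(roles=["reader"]),  # grant to the "reader" role
    actions=[AuthzedAction.READ_ONLINE],
)

# Permissions are applied and queried like any other registry object.
store.apply([reader_permission])
print([p.name for p in store.list_permissions()])
print(store.get_permission("online-reader"))
```
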
+ """ + return self._registry.list_saved_datasets( + self.project, allow_cache=allow_cache, tags=tags + ) + def _print_materialization_log( start_date, end_date, num_feature_views: int, online_store: str diff --git a/sdk/python/feast/infra/offline_stores/remote.py b/sdk/python/feast/infra/offline_stores/remote.py index dc657017d9..40239c8950 100644 --- a/sdk/python/feast/infra/offline_stores/remote.py +++ b/sdk/python/feast/infra/offline_stores/remote.py @@ -27,6 +27,9 @@ RetrievalMetadata, ) from feast.infra.registry.base_registry import BaseRegistry +from feast.permissions.client.arrow_flight_auth_interceptor import ( + build_arrow_flight_client, +) from feast.repo_config import FeastConfigBaseModel, RepoConfig from feast.saved_dataset import SavedDatasetStorage @@ -69,7 +72,11 @@ def _to_df_internal(self, timeout: Optional[int] = None) -> pd.DataFrame: # This is where do_get service is invoked def _to_arrow_internal(self, timeout: Optional[int] = None) -> pa.Table: return _send_retrieve_remote( - self.api, self.api_parameters, self.entity_df, self.table, self.client + self.api, + self.api_parameters, + self.entity_df, + self.table, + self.client, ) @property @@ -128,8 +135,9 @@ def get_historical_features( ) -> RemoteRetrievalJob: assert isinstance(config.offline_store, RemoteOfflineStoreConfig) - # Initialize the client connection - client = RemoteOfflineStore.init_client(config) + client = build_arrow_flight_client( + config.offline_store.host, config.offline_store.port, config.auth_config + ) feature_view_names = [fv.name for fv in feature_views] name_aliases = [fv.projection.name_alias for fv in feature_views] @@ -163,7 +171,9 @@ def pull_all_from_table_or_query( assert isinstance(config.offline_store, RemoteOfflineStoreConfig) # Initialize the client connection - client = RemoteOfflineStore.init_client(config) + client = build_arrow_flight_client( + config.offline_store.host, config.offline_store.port, config.auth_config + ) api_parameters = { "data_source_name": data_source.name, @@ -194,7 +204,9 @@ def pull_latest_from_table_or_query( assert isinstance(config.offline_store, RemoteOfflineStoreConfig) # Initialize the client connection - client = RemoteOfflineStore.init_client(config) + client = build_arrow_flight_client( + config.offline_store.host, config.offline_store.port, config.auth_config + ) api_parameters = { "data_source_name": data_source.name, @@ -227,7 +239,9 @@ def write_logged_features( data = pyarrow.parquet.read_table(data, use_threads=False, pre_buffer=False) # Initialize the client connection - client = RemoteOfflineStore.init_client(config) + client = build_arrow_flight_client( + config.offline_store.host, config.offline_store.port, config.auth_config + ) api_parameters = { "feature_service_name": source._feature_service.name, @@ -251,7 +265,9 @@ def offline_write_batch( assert isinstance(config.offline_store, RemoteOfflineStoreConfig) # Initialize the client connection - client = RemoteOfflineStore.init_client(config) + client = build_arrow_flight_client( + config.offline_store.host, config.offline_store.port, config.auth_config + ) feature_view_names = [feature_view.name] name_aliases = [feature_view.projection.name_alias] @@ -270,13 +286,6 @@ def offline_write_batch( entity_df=None, ) - @staticmethod - def init_client(config): - location = f"grpc://{config.offline_store.host}:{config.offline_store.port}" - client = fl.connect(location=location) - logger.info(f"Connecting FlightClient at {location}") - return client - def 
_create_retrieval_metadata(feature_refs: List[str], entity_df: pd.DataFrame): entity_schema = _get_entity_schema( @@ -331,11 +340,20 @@ def _send_retrieve_remote( table: pa.Table, client: fl.FlightClient, ): - command_descriptor = _call_put(api, api_parameters, client, entity_df, table) + command_descriptor = _call_put( + api, + api_parameters, + client, + entity_df, + table, + ) return _call_get(client, command_descriptor) -def _call_get(client: fl.FlightClient, command_descriptor: fl.FlightDescriptor): +def _call_get( + client: fl.FlightClient, + command_descriptor: fl.FlightDescriptor, +): flight = client.get_flight_info(command_descriptor) ticket = flight.endpoints[0].ticket reader = client.do_get(ticket) @@ -384,10 +402,7 @@ def _put_parameters( else: updatedTable = _create_empty_table() - writer, _ = client.do_put( - command_descriptor, - updatedTable.schema, - ) + writer, _ = client.do_put(command_descriptor, updatedTable.schema) writer.write_table(updatedTable) writer.close() diff --git a/sdk/python/feast/infra/online_stores/remote.py b/sdk/python/feast/infra/online_stores/remote.py index 19e1b7d515..93fbcaf771 100644 --- a/sdk/python/feast/infra/online_stores/remote.py +++ b/sdk/python/feast/infra/online_stores/remote.py @@ -16,11 +16,13 @@ from datetime import datetime from typing import Any, Callable, Dict, List, Literal, Optional, Sequence, Tuple -import requests from pydantic import StrictStr from feast import Entity, FeatureView, RepoConfig from feast.infra.online_stores.online_store import OnlineStore +from feast.permissions.client.http_auth_requests_wrapper import ( + get_http_auth_requests_session, +) from feast.protos.feast.types.EntityKey_pb2 import EntityKey as EntityKeyProto from feast.protos.feast.types.Value_pb2 import Value as ValueProto from feast.repo_config import FeastConfigBaseModel @@ -70,7 +72,7 @@ def online_read( req_body = self._construct_online_read_api_json_request( entity_keys, table, requested_features ) - response = requests.post( + response = get_http_auth_requests_session(config.auth_config).post( f"{config.online_store.path}/get-online-features", data=req_body ) if response.status_code == 200: diff --git a/sdk/python/feast/infra/registry/base_registry.py b/sdk/python/feast/infra/registry/base_registry.py index 03bec64830..33adb6b7c9 100644 --- a/sdk/python/feast/infra/registry/base_registry.py +++ b/sdk/python/feast/infra/registry/base_registry.py @@ -28,6 +28,7 @@ from feast.feature_view import FeatureView from feast.infra.infra_object import Infra from feast.on_demand_feature_view import OnDemandFeatureView +from feast.permissions.permission import Permission from feast.project_metadata import ProjectMetadata from feast.protos.feast.core.Entity_pb2 import Entity as EntityProto from feast.protos.feast.core.FeatureService_pb2 import ( @@ -37,6 +38,7 @@ from feast.protos.feast.core.OnDemandFeatureView_pb2 import ( OnDemandFeatureView as OnDemandFeatureViewProto, ) +from feast.protos.feast.core.Permission_pb2 import Permission as PermissionProto from feast.protos.feast.core.Registry_pb2 import Registry as RegistryProto from feast.protos.feast.core.SavedDataset_pb2 import SavedDataset as SavedDatasetProto from feast.protos.feast.core.StreamFeatureView_pb2 import ( @@ -457,7 +459,10 @@ def delete_saved_dataset(self, name: str, project: str, commit: bool = True): @abstractmethod def list_saved_datasets( - self, project: str, allow_cache: bool = False + self, + project: str, + allow_cache: bool = False, + tags: Optional[dict[str, str]] = None, ) -> 
List[SavedDataset]: """ Retrieves a list of all saved datasets in specified project @@ -465,6 +470,7 @@ def list_saved_datasets( Args: project: Feast project allow_cache: Whether to allow returning this dataset from a cached registry + tags: Filter by tags Returns: Returns the list of SavedDatasets @@ -521,17 +527,21 @@ def get_validation_reference( # TODO: Needs to be implemented. def list_validation_references( - self, project: str, allow_cache: bool = False + self, + project: str, + allow_cache: bool = False, + tags: Optional[dict[str, str]] = None, ) -> List[ValidationReference]: """ Retrieve a list of validation references from the registry Args: - allow_cache: Allow returning feature views from the cached registry - project: Filter feature views based on project name + project: Filter validation references based on project name + allow_cache: Allow returning validation references from the cached registry + tags: Filter by tags Returns: - List of request feature views + List of request validation references """ raise NotImplementedError @@ -590,6 +600,69 @@ def get_user_metadata( self, project: str, feature_view: BaseFeatureView ) -> Optional[bytes]: ... + # Permission operations + @abstractmethod + def apply_permission( + self, permission: Permission, project: str, commit: bool = True + ): + """ + Registers a single permission with Feast + + Args: + permission: A permission that will be registered + project: Feast project that this permission belongs to + commit: Whether to immediately commit to the registry + """ + raise NotImplementedError + + @abstractmethod + def delete_permission(self, name: str, project: str, commit: bool = True): + """ + Deletes a permission or raises an exception if not found. + + Args: + name: Name of permission + project: Feast project that this permission belongs to + commit: Whether the change should be persisted immediately + """ + raise NotImplementedError + + @abstractmethod + def get_permission( + self, name: str, project: str, allow_cache: bool = False + ) -> Permission: + """ + Retrieves a permission. 
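
A minimal sketch of driving the permission CRUD contract defined above through a store's registry handle follows; the method names and signatures are taken from this patch, while the permission name passed to `delete_permission` is hypothetical.

```python
from feast import FeatureStore
from feast.permissions.permission import Permission

store = FeatureStore(repo_path=".")
registry = store.registry  # any BaseRegistry implementation

# Listing and fetching follow the same (name, project, allow_cache) convention
# as the other registry objects.
permissions: list[Permission] = registry.list_permissions(project=store.project)
for p in permissions:
    print(p.name, registry.get_permission(p.name, store.project))

# Deletion mirrors delete_data_source/delete_feature_view and raises if the
# named permission does not exist in the project.
registry.delete_permission("stale-permission", store.project, commit=True)
```
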
+ + Args: + name: Name of permission + project: Feast project that this permission belongs to + allow_cache: Whether to allow returning this permission from a cached registry + + Returns: + Returns either the specified permission, or raises an exception if none is found + """ + raise NotImplementedError + + @abstractmethod + def list_permissions( + self, + project: str, + allow_cache: bool = False, + tags: Optional[dict[str, str]] = None, + ) -> List[Permission]: + """ + Retrieve a list of permissions from the registry + + Args: + project: Filter permission based on project name + allow_cache: Whether to allow returning permissions from a cached registry + + Returns: + List of permissions + """ + raise NotImplementedError + @abstractmethod def proto(self) -> RegistryProto: """ @@ -716,6 +789,13 @@ def to_dict(self, project: str) -> Dict[str, List[Any]]: registry_dict["infra"].append( self._message_to_sorted_dict(infra_object.to_proto()) ) + for permission in sorted( + self.list_permissions(project=project), key=lambda ds: ds.name + ): + registry_dict["permissions"].append( + self._message_to_sorted_dict(permission.to_proto()) + ) + return registry_dict @staticmethod @@ -732,4 +812,6 @@ def deserialize_registry_values(serialized_proto, feast_obj_type) -> Any: return OnDemandFeatureViewProto.FromString(serialized_proto) if feast_obj_type == FeatureService: return FeatureServiceProto.FromString(serialized_proto) + if feast_obj_type == Permission: + return PermissionProto.FromString(serialized_proto) return None diff --git a/sdk/python/feast/infra/registry/caching_registry.py b/sdk/python/feast/infra/registry/caching_registry.py index 298639028d..611d67de96 100644 --- a/sdk/python/feast/infra/registry/caching_registry.py +++ b/sdk/python/feast/infra/registry/caching_registry.py @@ -14,6 +14,7 @@ from feast.infra.registry import proto_registry_utils from feast.infra.registry.base_registry import BaseRegistry from feast.on_demand_feature_view import OnDemandFeatureView +from feast.permissions.permission import Permission from feast.project_metadata import ProjectMetadata from feast.saved_dataset import SavedDataset, ValidationReference from feast.stream_feature_view import StreamFeatureView @@ -249,18 +250,23 @@ def get_saved_dataset( return self._get_saved_dataset(name, project) @abstractmethod - def _list_saved_datasets(self, project: str) -> List[SavedDataset]: + def _list_saved_datasets( + self, project: str, tags: Optional[dict[str, str]] = None + ) -> List[SavedDataset]: pass def list_saved_datasets( - self, project: str, allow_cache: bool = False + self, + project: str, + allow_cache: bool = False, + tags: Optional[dict[str, str]] = None, ) -> List[SavedDataset]: if allow_cache: self._refresh_cached_registry_if_necessary() return proto_registry_utils.list_saved_datasets( - self.cached_registry_proto, project + self.cached_registry_proto, project, tags ) - return self._list_saved_datasets(project) + return self._list_saved_datasets(project, tags) @abstractmethod def _get_validation_reference(self, name: str, project: str) -> ValidationReference: @@ -277,18 +283,23 @@ def get_validation_reference( return self._get_validation_reference(name, project) @abstractmethod - def _list_validation_references(self, project: str) -> List[ValidationReference]: + def _list_validation_references( + self, project: str, tags: Optional[dict[str, str]] = None + ) -> List[ValidationReference]: pass def list_validation_references( - self, project: str, allow_cache: bool = False + self, + project: str, + 
allow_cache: bool = False, + tags: Optional[dict[str, str]] = None, ) -> List[ValidationReference]: if allow_cache: self._refresh_cached_registry_if_necessary() return proto_registry_utils.list_validation_references( - self.cached_registry_proto, project + self.cached_registry_proto, project, tags ) - return self._list_validation_references(project) + return self._list_validation_references(project, tags) @abstractmethod def _list_project_metadata(self, project: str) -> List[ProjectMetadata]: @@ -311,6 +322,39 @@ def _get_infra(self, project: str) -> Infra: def get_infra(self, project: str, allow_cache: bool = False) -> Infra: return self._get_infra(project) + @abstractmethod + def _get_permission(self, name: str, project: str) -> Permission: + pass + + def get_permission( + self, name: str, project: str, allow_cache: bool = False + ) -> Permission: + if allow_cache: + self._refresh_cached_registry_if_necessary() + return proto_registry_utils.get_permission( + self.cached_registry_proto, name, project + ) + return self._get_permission(name, project) + + @abstractmethod + def _list_permissions( + self, project: str, tags: Optional[dict[str, str]] + ) -> List[Permission]: + pass + + def list_permissions( + self, + project: str, + allow_cache: bool = False, + tags: Optional[dict[str, str]] = None, + ) -> List[Permission]: + if allow_cache: + self._refresh_cached_registry_if_necessary() + return proto_registry_utils.list_permissions( + self.cached_registry_proto, project, tags + ) + return self._list_permissions(project, tags) + def refresh(self, project: Optional[str] = None): if project: project_metadata = proto_registry_utils.get_project_metadata( diff --git a/sdk/python/feast/infra/registry/proto_registry_utils.py b/sdk/python/feast/infra/registry/proto_registry_utils.py index 0e85f5b0a9..f67808aab5 100644 --- a/sdk/python/feast/infra/registry/proto_registry_utils.py +++ b/sdk/python/feast/infra/registry/proto_registry_utils.py @@ -10,12 +10,14 @@ EntityNotFoundException, FeatureServiceNotFoundException, FeatureViewNotFoundException, + PermissionObjectNotFoundException, SavedDatasetNotFound, ValidationReferenceNotFound, ) from feast.feature_service import FeatureService from feast.feature_view import FeatureView from feast.on_demand_feature_view import OnDemandFeatureView +from feast.permissions.permission import Permission from feast.project_metadata import ProjectMetadata from feast.protos.feast.core.Registry_pb2 import ProjectMetadata as ProjectMetadataProto from feast.protos.feast.core.Registry_pb2 import Registry as RegistryProto @@ -252,24 +254,28 @@ def list_data_sources( return data_sources -@registry_proto_cache +@registry_proto_cache_with_tags def list_saved_datasets( - registry_proto: RegistryProto, project: str + registry_proto: RegistryProto, project: str, tags: Optional[dict[str, str]] ) -> List[SavedDataset]: saved_datasets = [] for saved_dataset in registry_proto.saved_datasets: - if saved_dataset.spec.project == project: + if saved_dataset.spec.project == project and utils.has_all_tags( + saved_dataset.tags, tags + ): saved_datasets.append(SavedDataset.from_proto(saved_dataset)) return saved_datasets -@registry_proto_cache +@registry_proto_cache_with_tags def list_validation_references( - registry_proto: RegistryProto, project: str + registry_proto: RegistryProto, project: str, tags: Optional[dict[str, str]] ) -> List[ValidationReference]: validation_references = [] for validation_reference in registry_proto.validation_references: - if validation_reference.project == 
project: + if validation_reference.project == project and utils.has_all_tags( + validation_reference.tags, tags + ): validation_references.append( ValidationReference.from_proto(validation_reference) ) @@ -285,3 +291,28 @@ def list_project_metadata( for project_metadata in registry_proto.project_metadata if project_metadata.project == project ] + + +@registry_proto_cache_with_tags +def list_permissions( + registry_proto: RegistryProto, project: str, tags: Optional[dict[str, str]] +) -> List[Permission]: + permissions = [] + for permission_proto in registry_proto.permissions: + if permission_proto.spec.project == project and utils.has_all_tags( + permission_proto.spec.tags, tags + ): + permissions.append(Permission.from_proto(permission_proto)) + return permissions + + +def get_permission( + registry_proto: RegistryProto, name: str, project: str +) -> Permission: + for permission_proto in registry_proto.permissions: + if ( + permission_proto.spec.project == project + and permission_proto.spec.name == name + ): + return Permission.from_proto(permission_proto) + raise PermissionObjectNotFoundException(name=name, project=project) diff --git a/sdk/python/feast/infra/registry/registry.py b/sdk/python/feast/infra/registry/registry.py index fe44e6253a..366f3aacaa 100644 --- a/sdk/python/feast/infra/registry/registry.py +++ b/sdk/python/feast/infra/registry/registry.py @@ -31,6 +31,7 @@ EntityNotFoundException, FeatureServiceNotFoundException, FeatureViewNotFoundException, + PermissionNotFoundException, ValidationReferenceNotFound, ) from feast.feature_service import FeatureService @@ -41,6 +42,8 @@ from feast.infra.registry.base_registry import BaseRegistry from feast.infra.registry.registry_store import NoopRegistryStore from feast.on_demand_feature_view import OnDemandFeatureView +from feast.permissions.auth_model import AuthConfig, NoAuthConfig +from feast.permissions.permission import Permission from feast.project_metadata import ProjectMetadata from feast.protos.feast.core.Registry_pb2 import Registry as RegistryProto from feast.repo_config import RegistryConfig @@ -73,6 +76,7 @@ class FeastObjectType(Enum): ON_DEMAND_FEATURE_VIEW = "on demand feature view" STREAM_FEATURE_VIEW = "stream feature view" FEATURE_SERVICE = "feature service" + PERMISSION = "permission" @staticmethod def get_objects_from_registry( @@ -91,6 +95,7 @@ def get_objects_from_registry( FeastObjectType.FEATURE_SERVICE: registry.list_feature_services( project=project ), + FeastObjectType.PERMISSION: registry.list_permissions(project=project), } @staticmethod @@ -104,6 +109,7 @@ def get_objects_from_repo_contents( FeastObjectType.ON_DEMAND_FEATURE_VIEW: repo_contents.on_demand_feature_views, FeastObjectType.STREAM_FEATURE_VIEW: repo_contents.stream_feature_views, FeastObjectType.FEATURE_SERVICE: repo_contents.feature_services, + FeastObjectType.PERMISSION: repo_contents.permissions, } @@ -160,6 +166,7 @@ def __new__( project: str, registry_config: Optional[RegistryConfig], repo_path: Optional[Path], + auth_config: AuthConfig = NoAuthConfig(), ): # We override __new__ so that we can inspect registry_config and create a SqlRegistry without callers # needing to make any changes. 
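
To illustrate the tag filtering and not-found behavior implemented above, here is a short hedged sketch: the `tags=` parameter and `PermissionObjectNotFoundException` come from this patch, while the tag key/value and the missing permission name are made up. Only permissions carrying all of the requested tags are returned, mirroring `utils.has_all_tags`.

```python
from feast import FeatureStore
from feast.errors import PermissionObjectNotFoundException

store = FeatureStore(repo_path=".")

# Tag filtering: only permissions that carry *all* requested tags are returned.
team_permissions = store.list_permissions(tags={"team": "risk"})
print([p.name for p in team_permissions])

# Lookups for unknown names are expected to surface the new exception type.
try:
    store.get_permission("does-not-exist")
except PermissionObjectNotFoundException as e:
    print(e)
```
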
@@ -174,7 +181,7 @@ def __new__( elif registry_config and registry_config.registry_type == "remote": from feast.infra.registry.remote import RemoteRegistry - return RemoteRegistry(registry_config, project, repo_path) + return RemoteRegistry(registry_config, project, repo_path, auth_config) else: return super(Registry, cls).__new__(cls) @@ -183,6 +190,7 @@ def __init__( project: str, registry_config: Optional[RegistryConfig], repo_path: Optional[Path], + auth_config: AuthConfig = NoAuthConfig(), ): """ Create the Registry object. @@ -194,6 +202,7 @@ def __init__( """ self._refresh_lock = Lock() + self._auth_config = auth_config if registry_config: registry_store_type = registry_config.registry_store_type @@ -211,7 +220,7 @@ def __init__( ) def clone(self) -> "Registry": - new_registry = Registry("project", None, None) + new_registry = Registry("project", None, None, self._auth_config) new_registry.cached_registry_proto_ttl = timedelta(seconds=0) new_registry.cached_registry_proto = ( self.cached_registry_proto.__deepcopy__() @@ -307,9 +316,6 @@ def apply_data_source( if existing_data_source_proto.name == data_source.name: del registry.data_sources[idx] data_source_proto = data_source.to_proto() - data_source_proto.data_source_class_type = ( - f"{data_source.__class__.__module__}.{data_source.__class__.__name__}" - ) data_source_proto.project = project data_source_proto.data_source_class_type = ( f"{data_source.__class__.__module__}.{data_source.__class__.__name__}" @@ -709,12 +715,15 @@ def get_saved_dataset( return proto_registry_utils.get_saved_dataset(registry_proto, name, project) def list_saved_datasets( - self, project: str, allow_cache: bool = False + self, + project: str, + allow_cache: bool = False, + tags: Optional[dict[str, str]] = None, ) -> List[SavedDataset]: registry_proto = self._get_registry_proto( project=project, allow_cache=allow_cache ) - return proto_registry_utils.list_saved_datasets(registry_proto, project) + return proto_registry_utils.list_saved_datasets(registry_proto, project, tags) def apply_validation_reference( self, @@ -751,12 +760,17 @@ def get_validation_reference( ) def list_validation_references( - self, project: str, allow_cache: bool = False + self, + project: str, + allow_cache: bool = False, + tags: Optional[dict[str, str]] = None, ) -> List[ValidationReference]: registry_proto = self._get_registry_proto( project=project, allow_cache=allow_cache ) - return proto_registry_utils.list_validation_references(registry_proto, project) + return proto_registry_utils.list_validation_references( + registry_proto, project, tags + ) def delete_validation_reference(self, name: str, project: str, commit: bool = True): registry_proto = self._prepare_registry_for_changes(project) @@ -905,3 +919,62 @@ def _existing_feature_view_names_to_fvs(self) -> Dict[str, Message]: fv.spec.name: fv for fv in self.cached_registry_proto.stream_feature_views } return {**odfvs, **fvs, **sfv} + + def get_permission( + self, name: str, project: str, allow_cache: bool = False + ) -> Permission: + registry_proto = self._get_registry_proto( + project=project, allow_cache=allow_cache + ) + return proto_registry_utils.get_permission(registry_proto, name, project) + + def list_permissions( + self, + project: str, + allow_cache: bool = False, + tags: Optional[dict[str, str]] = None, + ) -> List[Permission]: + registry_proto = self._get_registry_proto( + project=project, allow_cache=allow_cache + ) + return proto_registry_utils.list_permissions(registry_proto, project, tags) + + def 
apply_permission( + self, permission: Permission, project: str, commit: bool = True + ): + now = _utc_now() + if not permission.created_timestamp: + permission.created_timestamp = now + permission.last_updated_timestamp = now + + registry = self._prepare_registry_for_changes(project) + for idx, existing_permission_proto in enumerate(registry.permissions): + if ( + existing_permission_proto.spec.name == permission.name + and existing_permission_proto.spec.project == project + ): + permission.created_timestamp = ( + existing_permission_proto.meta.created_timestamp.ToDatetime() + ) + del registry.permissions[idx] + + permission_proto = permission.to_proto() + permission_proto.spec.project = project + registry.permissions.append(permission_proto) + if commit: + self.commit() + + def delete_permission(self, name: str, project: str, commit: bool = True): + self._prepare_registry_for_changes(project) + assert self.cached_registry_proto + + for idx, permission_proto in enumerate(self.cached_registry_proto.permissions): + if ( + permission_proto.spec.name == name + and permission_proto.spec.project == project + ): + del self.cached_registry_proto.permissions[idx] + if commit: + self.commit() + return + raise PermissionNotFoundException(name, project) diff --git a/sdk/python/feast/infra/registry/remote.py b/sdk/python/feast/infra/registry/remote.py index 9fa6d8ebee..618628bc07 100644 --- a/sdk/python/feast/infra/registry/remote.py +++ b/sdk/python/feast/infra/registry/remote.py @@ -15,6 +15,15 @@ from feast.infra.infra_object import Infra from feast.infra.registry.base_registry import BaseRegistry from feast.on_demand_feature_view import OnDemandFeatureView +from feast.permissions.auth.auth_type import AuthType +from feast.permissions.auth_model import ( + AuthConfig, + NoAuthConfig, +) +from feast.permissions.client.grpc_client_auth_interceptor import ( + GrpcClientAuthHeaderInterceptor, +) +from feast.permissions.permission import Permission from feast.project_metadata import ProjectMetadata from feast.protos.feast.core.Registry_pb2 import Registry as RegistryProto from feast.protos.feast.registry import RegistryServer_pb2, RegistryServer_pb2_grpc @@ -38,31 +47,32 @@ def __init__( registry_config: Union[RegistryConfig, RemoteRegistryConfig], project: str, repo_path: Optional[Path], + auth_config: AuthConfig = NoAuthConfig(), ): - self.channel = grpc.insecure_channel(registry_config.path) - self.stub = RegistryServer_pb2_grpc.RegistryServerStub(self.channel) + self.auth_config = auth_config + channel = grpc.insecure_channel(registry_config.path) + if self.auth_config.type != AuthType.NONE.value: + auth_header_interceptor = GrpcClientAuthHeaderInterceptor(auth_config) + channel = grpc.intercept_channel(channel, auth_header_interceptor) + self.stub = RegistryServer_pb2_grpc.RegistryServerStub(channel) def apply_entity(self, entity: Entity, project: str, commit: bool = True): request = RegistryServer_pb2.ApplyEntityRequest( entity=entity.to_proto(), project=project, commit=commit ) - self.stub.ApplyEntity(request) def delete_entity(self, name: str, project: str, commit: bool = True): request = RegistryServer_pb2.DeleteEntityRequest( name=name, project=project, commit=commit ) - self.stub.DeleteEntity(request) def get_entity(self, name: str, project: str, allow_cache: bool = False) -> Entity: request = RegistryServer_pb2.GetEntityRequest( name=name, project=project, allow_cache=allow_cache ) - response = self.stub.GetEntity(request) - return Entity.from_proto(response) def list_entities( @@ -74,9 +84,7 
@@ def list_entities( request = RegistryServer_pb2.ListEntitiesRequest( project=project, allow_cache=allow_cache, tags=tags ) - response = self.stub.ListEntities(request) - return [Entity.from_proto(entity) for entity in response.entities] def apply_data_source( @@ -85,14 +93,12 @@ def apply_data_source( request = RegistryServer_pb2.ApplyDataSourceRequest( data_source=data_source.to_proto(), project=project, commit=commit ) - self.stub.ApplyDataSource(request) def delete_data_source(self, name: str, project: str, commit: bool = True): request = RegistryServer_pb2.DeleteDataSourceRequest( name=name, project=project, commit=commit ) - self.stub.DeleteDataSource(request) def get_data_source( @@ -101,9 +107,7 @@ def get_data_source( request = RegistryServer_pb2.GetDataSourceRequest( name=name, project=project, allow_cache=allow_cache ) - response = self.stub.GetDataSource(request) - return DataSource.from_proto(response) def list_data_sources( @@ -115,9 +119,7 @@ def list_data_sources( request = RegistryServer_pb2.ListDataSourcesRequest( project=project, allow_cache=allow_cache, tags=tags ) - response = self.stub.ListDataSources(request) - return [ DataSource.from_proto(data_source) for data_source in response.data_sources ] @@ -128,14 +130,12 @@ def apply_feature_service( request = RegistryServer_pb2.ApplyFeatureServiceRequest( feature_service=feature_service.to_proto(), project=project, commit=commit ) - self.stub.ApplyFeatureService(request) def delete_feature_service(self, name: str, project: str, commit: bool = True): request = RegistryServer_pb2.DeleteFeatureServiceRequest( name=name, project=project, commit=commit ) - self.stub.DeleteFeatureService(request) def get_feature_service( @@ -144,9 +144,7 @@ def get_feature_service( request = RegistryServer_pb2.GetFeatureServiceRequest( name=name, project=project, allow_cache=allow_cache ) - response = self.stub.GetFeatureService(request) - return FeatureService.from_proto(response) def list_feature_services( @@ -158,9 +156,7 @@ def list_feature_services( request = RegistryServer_pb2.ListFeatureServicesRequest( project=project, allow_cache=allow_cache, tags=tags ) - response = self.stub.ListFeatureServices(request) - return [ FeatureService.from_proto(feature_service) for feature_service in response.feature_services @@ -196,7 +192,6 @@ def delete_feature_view(self, name: str, project: str, commit: bool = True): request = RegistryServer_pb2.DeleteFeatureViewRequest( name=name, project=project, commit=commit ) - self.stub.DeleteFeatureView(request) def get_stream_feature_view( @@ -205,9 +200,7 @@ def get_stream_feature_view( request = RegistryServer_pb2.GetStreamFeatureViewRequest( name=name, project=project, allow_cache=allow_cache ) - response = self.stub.GetStreamFeatureView(request) - return StreamFeatureView.from_proto(response) def list_stream_feature_views( @@ -219,9 +212,7 @@ def list_stream_feature_views( request = RegistryServer_pb2.ListStreamFeatureViewsRequest( project=project, allow_cache=allow_cache, tags=tags ) - response = self.stub.ListStreamFeatureViews(request) - return [ StreamFeatureView.from_proto(stream_feature_view) for stream_feature_view in response.stream_feature_views @@ -233,9 +224,7 @@ def get_on_demand_feature_view( request = RegistryServer_pb2.GetOnDemandFeatureViewRequest( name=name, project=project, allow_cache=allow_cache ) - response = self.stub.GetOnDemandFeatureView(request) - return OnDemandFeatureView.from_proto(response) def list_on_demand_feature_views( @@ -247,9 +236,7 @@ def 
list_on_demand_feature_views( request = RegistryServer_pb2.ListOnDemandFeatureViewsRequest( project=project, allow_cache=allow_cache, tags=tags ) - response = self.stub.ListOnDemandFeatureViews(request) - return [ OnDemandFeatureView.from_proto(on_demand_feature_view) for on_demand_feature_view in response.on_demand_feature_views @@ -261,9 +248,7 @@ def get_feature_view( request = RegistryServer_pb2.GetFeatureViewRequest( name=name, project=project, allow_cache=allow_cache ) - response = self.stub.GetFeatureView(request) - return FeatureView.from_proto(response) def list_feature_views( @@ -275,7 +260,6 @@ def list_feature_views( request = RegistryServer_pb2.ListFeatureViewsRequest( project=project, allow_cache=allow_cache, tags=tags ) - response = self.stub.ListFeatureViews(request) return [ @@ -304,7 +288,6 @@ def apply_materialization( end_date=end_date_timestamp, commit=commit, ) - self.stub.ApplyMaterialization(request) def apply_saved_dataset( @@ -316,14 +299,12 @@ def apply_saved_dataset( request = RegistryServer_pb2.ApplySavedDatasetRequest( saved_dataset=saved_dataset.to_proto(), project=project, commit=commit ) - self.stub.ApplyFeatureService(request) def delete_saved_dataset(self, name: str, project: str, commit: bool = True): request = RegistryServer_pb2.DeleteSavedDatasetRequest( name=name, project=project, commit=commit ) - self.stub.DeleteSavedDataset(request) def get_saved_dataset( @@ -332,20 +313,19 @@ def get_saved_dataset( request = RegistryServer_pb2.GetSavedDatasetRequest( name=name, project=project, allow_cache=allow_cache ) - response = self.stub.GetSavedDataset(request) - return SavedDataset.from_proto(response) def list_saved_datasets( - self, project: str, allow_cache: bool = False + self, + project: str, + allow_cache: bool = False, + tags: Optional[dict[str, str]] = None, ) -> List[SavedDataset]: request = RegistryServer_pb2.ListSavedDatasetsRequest( - project=project, allow_cache=allow_cache + project=project, allow_cache=allow_cache, tags=tags ) - response = self.stub.ListSavedDatasets(request) - return [ SavedDataset.from_proto(saved_dataset) for saved_dataset in response.saved_datasets @@ -362,14 +342,12 @@ def apply_validation_reference( project=project, commit=commit, ) - self.stub.ApplyValidationReference(request) def delete_validation_reference(self, name: str, project: str, commit: bool = True): request = RegistryServer_pb2.DeleteValidationReferenceRequest( name=name, project=project, commit=commit ) - self.stub.DeleteValidationReference(request) def get_validation_reference( @@ -378,20 +356,19 @@ def get_validation_reference( request = RegistryServer_pb2.GetValidationReferenceRequest( name=name, project=project, allow_cache=allow_cache ) - response = self.stub.GetValidationReference(request) - return ValidationReference.from_proto(response) def list_validation_references( - self, project: str, allow_cache: bool = False + self, + project: str, + allow_cache: bool = False, + tags: Optional[dict[str, str]] = None, ) -> List[ValidationReference]: request = RegistryServer_pb2.ListValidationReferencesRequest( - project=project, allow_cache=allow_cache + project=project, allow_cache=allow_cache, tags=tags ) - response = self.stub.ListValidationReferences(request) - return [ ValidationReference.from_proto(validation_reference) for validation_reference in response.validation_references @@ -403,25 +380,20 @@ def list_project_metadata( request = RegistryServer_pb2.ListProjectMetadataRequest( project=project, allow_cache=allow_cache ) - response = 
self.stub.ListProjectMetadata(request) - return [ProjectMetadata.from_proto(pm) for pm in response.project_metadata] def update_infra(self, infra: Infra, project: str, commit: bool = True): request = RegistryServer_pb2.UpdateInfraRequest( infra=infra.to_proto(), project=project, commit=commit ) - self.stub.UpdateInfra(request) def get_infra(self, project: str, allow_cache: bool = False) -> Infra: request = RegistryServer_pb2.GetInfraRequest( project=project, allow_cache=allow_cache ) - response = self.stub.GetInfra(request) - return Infra.from_proto(response) def apply_user_metadata( @@ -437,6 +409,47 @@ def get_user_metadata( ) -> Optional[bytes]: pass + def apply_permission( + self, permission: Permission, project: str, commit: bool = True + ): + permission_proto = permission.to_proto() + permission_proto.spec.project = project + + request = RegistryServer_pb2.ApplyPermissionRequest( + permission=permission_proto, project=project, commit=commit + ) + self.stub.ApplyPermission(request) + + def delete_permission(self, name: str, project: str, commit: bool = True): + request = RegistryServer_pb2.DeletePermissionRequest( + name=name, project=project, commit=commit + ) + self.stub.DeletePermission(request) + + def get_permission( + self, name: str, project: str, allow_cache: bool = False + ) -> Permission: + request = RegistryServer_pb2.GetPermissionRequest( + name=name, project=project, allow_cache=allow_cache + ) + response = self.stub.GetPermission(request) + + return Permission.from_proto(response) + + def list_permissions( + self, + project: str, + allow_cache: bool = False, + tags: Optional[dict[str, str]] = None, + ) -> List[Permission]: + request = RegistryServer_pb2.ListPermissionsRequest( + project=project, allow_cache=allow_cache, tags=tags + ) + response = self.stub.ListPermissions(request) + return [ + Permission.from_proto(permission) for permission in response.permissions + ] + def proto(self) -> RegistryProto: return self.stub.Proto(Empty()) @@ -445,7 +458,6 @@ def commit(self): def refresh(self, project: Optional[str] = None): request = RegistryServer_pb2.RefreshRequest(project=str(project)) - self.stub.Refresh(request) def teardown(self): diff --git a/sdk/python/feast/infra/registry/snowflake.py b/sdk/python/feast/infra/registry/snowflake.py index ac4f52dc06..801b90afe3 100644 --- a/sdk/python/feast/infra/registry/snowflake.py +++ b/sdk/python/feast/infra/registry/snowflake.py @@ -18,6 +18,7 @@ EntityNotFoundException, FeatureServiceNotFoundException, FeatureViewNotFoundException, + PermissionNotFoundException, SavedDatasetNotFound, ValidationReferenceNotFound, ) @@ -31,6 +32,7 @@ execute_snowflake_statement, ) from feast.on_demand_feature_view import OnDemandFeatureView +from feast.permissions.permission import Permission from feast.project_metadata import ProjectMetadata from feast.protos.feast.core.DataSource_pb2 import DataSource as DataSourceProto from feast.protos.feast.core.Entity_pb2 import Entity as EntityProto @@ -42,6 +44,7 @@ from feast.protos.feast.core.OnDemandFeatureView_pb2 import ( OnDemandFeatureView as OnDemandFeatureViewProto, ) +from feast.protos.feast.core.Permission_pb2 import Permission as PermissionProto from feast.protos.feast.core.Registry_pb2 import Registry as RegistryProto from feast.protos.feast.core.SavedDataset_pb2 import SavedDataset as SavedDatasetProto from feast.protos.feast.core.StreamFeatureView_pb2 import ( @@ -342,6 +345,17 @@ def _apply_object( self._set_last_updated_metadata(update_datetime, project) + def apply_permission( + self, 
permission: Permission, project: str, commit: bool = True + ): + return self._apply_object( + "PERMISSIONS", + project, + "PERMISSION_NAME", + permission, + "PERMISSION_PROTO", + ) + # delete operations def delete_data_source(self, name: str, project: str, commit: bool = True): return self._delete_object( @@ -421,6 +435,15 @@ def _delete_object( return cursor.rowcount + def delete_permission(self, name: str, project: str, commit: bool = True): + return self._delete_object( + "PERMISSIONS", + name, + project, + "PERMISSION_NAME", + PermissionNotFoundException, + ) + # get operations def get_data_source( self, name: str, project: str, allow_cache: bool = False @@ -619,6 +642,25 @@ def _get_object( else: return None + def get_permission( + self, name: str, project: str, allow_cache: bool = False + ) -> Permission: + if allow_cache: + self._refresh_cached_registry_if_necessary() + return proto_registry_utils.get_permission( + self.cached_registry_proto, name, project + ) + return self._get_object( + "PERMISSIONS", + name, + project, + PermissionProto, + Permission, + "PERMISSION_NAME", + "PERMISSION_PROTO", + PermissionNotFoundException, + ) + # list operations def list_data_sources( self, @@ -716,12 +758,15 @@ def list_on_demand_feature_views( ) def list_saved_datasets( - self, project: str, allow_cache: bool = False + self, + project: str, + allow_cache: bool = False, + tags: Optional[dict[str, str]] = None, ) -> List[SavedDataset]: if allow_cache: self._refresh_cached_registry_if_necessary() return proto_registry_utils.list_saved_datasets( - self.cached_registry_proto, project + self.cached_registry_proto, project, tags ) return self._list_objects( "SAVED_DATASETS", @@ -729,6 +774,7 @@ def list_saved_datasets( SavedDatasetProto, SavedDataset, "SAVED_DATASET_PROTO", + tags=tags, ) def list_stream_feature_views( @@ -752,7 +798,10 @@ def list_stream_feature_views( ) def list_validation_references( - self, project: str, allow_cache: bool = False + self, + project: str, + allow_cache: bool = False, + tags: Optional[dict[str, str]] = None, ) -> List[ValidationReference]: return self._list_objects( "VALIDATION_REFERENCES", @@ -760,6 +809,7 @@ def list_validation_references( ValidationReferenceProto, ValidationReference, "VALIDATION_REFERENCE_PROTO", + tags=tags, ) def _list_objects( @@ -793,6 +843,26 @@ def _list_objects( return objects return [] + def list_permissions( + self, + project: str, + allow_cache: bool = False, + tags: Optional[dict[str, str]] = None, + ) -> List[Permission]: + if allow_cache: + self._refresh_cached_registry_if_necessary() + return proto_registry_utils.list_permissions( + self.cached_registry_proto, project + ) + return self._list_objects( + "PERMISSIONS", + project, + PermissionProto, + Permission, + "PERMISSION_PROTO", + tags, + ) + def apply_materialization( self, feature_view: FeatureView, @@ -934,6 +1004,7 @@ def proto(self) -> RegistryProto: (self.list_saved_datasets, r.saved_datasets), (self.list_validation_references, r.validation_references), (self.list_project_metadata, r.project_metadata), + (self.list_permissions, r.permissions), ]: objs: List[Any] = lister(project) # type: ignore if objs: @@ -964,6 +1035,7 @@ def _get_all_projects(self) -> Set[str]: "FEATURE_VIEWS", "ON_DEMAND_FEATURE_VIEWS", "STREAM_FEATURE_VIEWS", + "PERMISSIONS", ] with GetSnowflakeConnection(self.registry_config) as conn: diff --git a/sdk/python/feast/infra/registry/sql.py b/sdk/python/feast/infra/registry/sql.py index a2b16a3a09..90c6e82e7d 100644 --- 
a/sdk/python/feast/infra/registry/sql.py +++ b/sdk/python/feast/infra/registry/sql.py @@ -30,6 +30,7 @@ EntityNotFoundException, FeatureServiceNotFoundException, FeatureViewNotFoundException, + PermissionNotFoundException, SavedDatasetNotFound, ValidationReferenceNotFound, ) @@ -38,6 +39,7 @@ from feast.infra.infra_object import Infra from feast.infra.registry.caching_registry import CachingRegistry from feast.on_demand_feature_view import OnDemandFeatureView +from feast.permissions.permission import Permission from feast.project_metadata import ProjectMetadata from feast.protos.feast.core.DataSource_pb2 import DataSource as DataSourceProto from feast.protos.feast.core.Entity_pb2 import Entity as EntityProto @@ -49,6 +51,7 @@ from feast.protos.feast.core.OnDemandFeatureView_pb2 import ( OnDemandFeatureView as OnDemandFeatureViewProto, ) +from feast.protos.feast.core.Permission_pb2 import Permission as PermissionProto from feast.protos.feast.core.Registry_pb2 import Registry as RegistryProto from feast.protos.feast.core.SavedDataset_pb2 import SavedDataset as SavedDatasetProto from feast.protos.feast.core.StreamFeatureView_pb2 import ( @@ -149,6 +152,15 @@ Column("infra_proto", LargeBinary, nullable=False), ) +permissions = Table( + "permissions", + metadata, + Column("permission_name", String(255), primary_key=True), + Column("project_id", String(50), primary_key=True), + Column("last_updated_timestamp", BigInteger, nullable=False), + Column("permission_proto", LargeBinary, nullable=False), +) + class FeastMetadataKeys(Enum): LAST_UPDATED_TIMESTAMP = "last_updated_timestamp" @@ -207,6 +219,7 @@ def teardown(self): on_demand_feature_views, saved_datasets, validation_references, + permissions, }: with self.engine.begin() as conn: stmt = delete(t) @@ -319,13 +332,16 @@ def _get_validation_reference(self, name: str, project: str) -> ValidationRefere not_found_exception=ValidationReferenceNotFound, ) - def _list_validation_references(self, project: str) -> List[ValidationReference]: + def _list_validation_references( + self, project: str, tags: Optional[dict[str, str]] = None + ) -> List[ValidationReference]: return self._list_objects( table=validation_references, project=project, proto_class=ValidationReferenceProto, python_class=ValidationReference, proto_field_name="validation_reference_proto", + tags=tags, ) def _list_entities( @@ -447,13 +463,16 @@ def _list_feature_views( tags=tags, ) - def _list_saved_datasets(self, project: str) -> List[SavedDataset]: + def _list_saved_datasets( + self, project: str, tags: Optional[dict[str, str]] = None + ) -> List[SavedDataset]: return self._list_objects( saved_datasets, project, SavedDatasetProto, SavedDataset, "saved_dataset_proto", + tags=tags, ) def _list_on_demand_feature_views( @@ -666,6 +685,7 @@ def proto(self) -> RegistryProto: (self.list_saved_datasets, r.saved_datasets), (self.list_validation_references, r.validation_references), (self.list_project_metadata, r.project_metadata), + (self.list_permissions, r.permissions), ]: objs: List[Any] = lister(project) # type: ignore if objs: @@ -721,6 +741,7 @@ def _apply_object( "saved_dataset_proto", "feature_view_proto", "feature_service_proto", + "permission_proto", ]: deserialized_proto = self.deserialize_registry_values( row._mapping[proto_field_name], type(obj) @@ -917,6 +938,7 @@ def _get_all_projects(self) -> Set[str]: feature_views, on_demand_feature_views, stream_feature_views, + permissions, }: stmt = select(table) rows = conn.execute(stmt).all() @@ -924,3 +946,44 @@ def 
_get_all_projects(self) -> Set[str]: projects.add(row._mapping["project_id"]) return projects + + def _get_permission(self, name: str, project: str) -> Permission: + return self._get_object( + table=permissions, + name=name, + project=project, + proto_class=PermissionProto, + python_class=Permission, + id_field_name="permission_name", + proto_field_name="permission_proto", + not_found_exception=PermissionNotFoundException, + ) + + def _list_permissions( + self, project: str, tags: Optional[dict[str, str]] + ) -> List[Permission]: + return self._list_objects( + permissions, + project, + PermissionProto, + Permission, + "permission_proto", + tags=tags, + ) + + def apply_permission( + self, permission: Permission, project: str, commit: bool = True + ): + return self._apply_object( + permissions, project, "permission_name", permission, "permission_proto" + ) + + def delete_permission(self, name: str, project: str, commit: bool = True): + with self.engine.begin() as conn: + stmt = delete(permissions).where( + permissions.c.permission_name == name, + permissions.c.project_id == project, + ) + rows = conn.execute(stmt) + if rows.rowcount < 1: + raise PermissionNotFoundException(name, project) diff --git a/sdk/python/feast/infra/utils/snowflake/registry/snowflake_table_creation.sql b/sdk/python/feast/infra/utils/snowflake/registry/snowflake_table_creation.sql index aa35caeac4..021d175b4e 100644 --- a/sdk/python/feast/infra/utils/snowflake/registry/snowflake_table_creation.sql +++ b/sdk/python/feast/infra/utils/snowflake/registry/snowflake_table_creation.sql @@ -80,4 +80,12 @@ CREATE TABLE IF NOT EXISTS REGISTRY_PATH."VALIDATION_REFERENCES" ( last_updated_timestamp TIMESTAMP_LTZ NOT NULL, validation_reference_proto BINARY NOT NULL, PRIMARY KEY (validation_reference_name, project_id) -) +); + +CREATE TABLE IF NOT EXISTS REGISTRY_PATH."PERMISSIONS" ( + permission_name VARCHAR, + project_id VARCHAR, + last_updated_timestamp TIMESTAMP_LTZ NOT NULL, + permission_proto BINARY NOT NULL, + PRIMARY KEY (permission_name, project_id) +); diff --git a/sdk/python/feast/infra/utils/snowflake/registry/snowflake_table_deletion.sql b/sdk/python/feast/infra/utils/snowflake/registry/snowflake_table_deletion.sql index a355c72062..780424abd1 100644 --- a/sdk/python/feast/infra/utils/snowflake/registry/snowflake_table_deletion.sql +++ b/sdk/python/feast/infra/utils/snowflake/registry/snowflake_table_deletion.sql @@ -17,3 +17,5 @@ DROP TABLE IF EXISTS REGISTRY_PATH."SAVED_DATASETS"; DROP TABLE IF EXISTS REGISTRY_PATH."STREAM_FEATURE_VIEWS"; DROP TABLE IF EXISTS REGISTRY_PATH."VALIDATION_REFERENCES" + +DROP TABLE IF EXISTS REGISTRY_PATH."PERMISSIONS" diff --git a/sdk/python/feast/offline_server.py b/sdk/python/feast/offline_server.py index be92620d68..839acada93 100644 --- a/sdk/python/feast/offline_server.py +++ b/sdk/python/feast/offline_server.py @@ -3,7 +3,7 @@ import logging import traceback from datetime import datetime -from typing import Any, Dict, List +from typing import Any, Dict, List, cast import pyarrow as pa import pyarrow.flight as fl @@ -12,14 +12,33 @@ from feast.feature_logging import FeatureServiceLoggingSource from feast.feature_view import DUMMY_ENTITY_NAME from feast.infra.offline_stores.offline_utils import get_offline_store_from_config +from feast.permissions.action import AuthzedAction +from feast.permissions.security_manager import assert_permissions +from feast.permissions.server.arrow import ( + arrowflight_middleware, + inject_user_details_decorator, +) +from feast.permissions.server.utils 
import ( + ServerType, + init_auth_manager, + init_security_manager, + str_to_auth_manager_type, +) from feast.saved_dataset import SavedDatasetStorage logger = logging.getLogger(__name__) +logger.setLevel(logging.INFO) class OfflineServer(fl.FlightServerBase): def __init__(self, store: FeatureStore, location: str, **kwargs): - super(OfflineServer, self).__init__(location, **kwargs) + super(OfflineServer, self).__init__( + location, + middleware=arrowflight_middleware( + str_to_auth_manager_type(store.config.auth_config.type) + ), + **kwargs, + ) self._location = location # A dictionary of configured flights, e.g. API calls received and not yet served self.flights: Dict[str, Any] = {} @@ -41,6 +60,7 @@ def _make_flight_info(self, key: Any, descriptor: fl.FlightDescriptor): return fl.FlightInfo(schema, descriptor, endpoints, -1, -1) + @inject_user_details_decorator def get_flight_info( self, context: fl.ServerCallContext, descriptor: fl.FlightDescriptor ): @@ -49,6 +69,7 @@ def get_flight_info( return self._make_flight_info(key, descriptor) raise KeyError("Flight not found.") + @inject_user_details_decorator def list_flights(self, context: fl.ServerCallContext, criteria: bytes): for key, table in self.flights.items(): if key[1] is not None: @@ -60,6 +81,7 @@ def list_flights(self, context: fl.ServerCallContext, criteria: bytes): # Expects to receive request parameters and stores them in the flights dictionary # Indexed by the unique command + @inject_user_details_decorator def do_put( self, context: fl.ServerCallContext, @@ -156,6 +178,7 @@ def _validate_do_get_parameters(self, command: dict): # Extracts the API parameters from the flights dictionary, delegates the execution to the FeatureStore instance # and returns the stream of data + @inject_user_details_decorator def do_get(self, context: fl.ServerCallContext, ticket: fl.Ticket): key = ast.literal_eval(ticket.ticket.decode()) if key not in self.flights: @@ -217,7 +240,15 @@ def offline_write_batch(self, command: dict, key: str): assert len(feature_views) == 1, "incorrect feature view" table = self.flights[key] self.offline_store.offline_write_batch( - self.store.config, feature_views[0], table, command["progress"] + self.store.config, + cast( + FeatureView, + assert_permissions( + feature_views[0], actions=[AuthzedAction.WRITE_OFFLINE] + ), + ), + table, + command["progress"], ) def _validate_write_logged_features_parameters(self, command: dict): @@ -234,6 +265,10 @@ def write_logged_features(self, command: dict, key: str): feature_service.logging_config is not None ), "feature service must have logging_config set" + assert_permissions( + resource=feature_service, + actions=[AuthzedAction.WRITE_OFFLINE], + ) self.offline_store.write_logged_features( config=self.store.config, data=table, @@ -260,10 +295,12 @@ def _validate_pull_all_from_table_or_query_parameters(self, command: dict): def pull_all_from_table_or_query(self, command: dict): self._validate_pull_all_from_table_or_query_parameters(command) + data_source = self.store.get_data_source(command["data_source_name"]) + assert_permissions(data_source, actions=[AuthzedAction.READ_OFFLINE]) return self.offline_store.pull_all_from_table_or_query( self.store.config, - self.store.get_data_source(command["data_source_name"]), + data_source, command["join_key_columns"], command["feature_name_columns"], command["timestamp_field"], @@ -287,10 +324,11 @@ def _validate_pull_latest_from_table_or_query_parameters(self, command: dict): def pull_latest_from_table_or_query(self, command: dict): 
self._validate_pull_latest_from_table_or_query_parameters(command) - + data_source = self.store.get_data_source(command["data_source_name"]) + assert_permissions(resource=data_source, actions=[AuthzedAction.READ_OFFLINE]) return self.offline_store.pull_latest_from_table_or_query( self.store.config, - self.store.get_data_source(command["data_source_name"]), + data_source, command["join_key_columns"], command["feature_name_columns"], command["timestamp_field"], @@ -343,6 +381,11 @@ def get_historical_features(self, command: dict, key: str): project=project, ) + for feature_view in feature_views: + assert_permissions( + resource=feature_view, actions=[AuthzedAction.READ_OFFLINE] + ) + retJob = self.offline_store.get_historical_features( config=self.store.config, feature_views=feature_views, @@ -377,6 +420,10 @@ def persist(self, command: dict, key: str): raise NotImplementedError data_source = self.store.get_data_source(command["data_source_name"]) + assert_permissions( + resource=data_source, + actions=[AuthzedAction.WRITE_OFFLINE], + ) storage = SavedDatasetStorage.from_data_source(data_source) ret_job.persist(storage, command["allow_overwrite"], command["timeout"]) except Exception as e: @@ -401,11 +448,23 @@ def remove_dummies(fv: FeatureView) -> FeatureView: return fv +def _init_auth_manager(store: FeatureStore): + auth_type = str_to_auth_manager_type(store.config.auth_config.type) + init_security_manager(auth_type=auth_type, fs=store) + init_auth_manager( + auth_type=auth_type, + server_type=ServerType.ARROW, + auth_config=store.config.auth_config, + ) + + def start_server( store: FeatureStore, host: str, port: int, ): + _init_auth_manager(store) + location = "grpc+tcp://{}:{}".format(host, port) server = OfflineServer(store, location) logger.info(f"Offline store server serving on {location}") diff --git a/sdk/python/feast/permissions/__init__.py b/sdk/python/feast/permissions/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/sdk/python/feast/permissions/action.py b/sdk/python/feast/permissions/action.py new file mode 100644 index 0000000000..0e85c1685f --- /dev/null +++ b/sdk/python/feast/permissions/action.py @@ -0,0 +1,40 @@ +import enum + + +class AuthzedAction(enum.Enum): + """ + Identify the type of action being secured by the permissions framework, according to the familiar CRUD and Feast terminology. 
+ """ + + CREATE = "create" # Create an instance + DESCRIBE = "describe" # Access the instance state + UPDATE = "update" # Update the instance state + DELETE = "delete" # Delete an instance + READ_ONLINE = "read_online" # Read the online store only + READ_OFFLINE = "read_offline" # Read the offline store only + WRITE_ONLINE = "write_online" # Write to the online store only + WRITE_OFFLINE = "write_offline" # Write to the offline store only + + +# Alias for all available actions +ALL_ACTIONS = [a for a in AuthzedAction.__members__.values()] + +# Alias for all read actions +READ = [ + AuthzedAction.READ_OFFLINE, + AuthzedAction.READ_ONLINE, +] +# Alias for all write actions +WRITE = [ + AuthzedAction.WRITE_OFFLINE, + AuthzedAction.WRITE_ONLINE, +] + + +# Alias for CRUD actions +CRUD = [ + AuthzedAction.CREATE, + AuthzedAction.DESCRIBE, + AuthzedAction.UPDATE, + AuthzedAction.DELETE, +] diff --git a/sdk/python/feast/permissions/auth/__init__.py b/sdk/python/feast/permissions/auth/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/sdk/python/feast/permissions/auth/auth_manager.py b/sdk/python/feast/permissions/auth/auth_manager.py new file mode 100644 index 0000000000..e608904567 --- /dev/null +++ b/sdk/python/feast/permissions/auth/auth_manager.py @@ -0,0 +1,68 @@ +from abc import ABC +from typing import Optional + +from .token_extractor import NoAuthTokenExtractor, TokenExtractor +from .token_parser import NoAuthTokenParser, TokenParser + + +class AuthManager(ABC): + """ + The authorization manager offers services to manage authorization tokens from client requests + to extract user details before injecting them in the security context. + """ + + _token_parser: TokenParser + _token_extractor: TokenExtractor + + def __init__(self, token_parser: TokenParser, token_extractor: TokenExtractor): + self._token_parser = token_parser + self._token_extractor = token_extractor + + @property + def token_parser(self) -> TokenParser: + return self._token_parser + + @property + def token_extractor(self) -> TokenExtractor: + return self._token_extractor + + +""" +The possibly empty global instance of `AuthManager`. +""" +_auth_manager: Optional[AuthManager] = None + + +def get_auth_manager() -> AuthManager: + """ + Return the global instance of `AuthManager`. + + Raises: + RuntimeError if the clobal instance is not set. + """ + global _auth_manager + if _auth_manager is None: + raise RuntimeError( + "AuthManager is not initialized. Call 'set_auth_manager' first." + ) + return _auth_manager + + +def set_auth_manager(auth_manager: AuthManager): + """ + Initialize the global instance of `AuthManager`. + """ + + global _auth_manager + _auth_manager = auth_manager + + +class AllowAll(AuthManager): + """ + An AuthManager not extracting nor parsing the authorization token. + """ + + def __init__(self): + super().__init__( + token_extractor=NoAuthTokenExtractor(), token_parser=NoAuthTokenParser() + ) diff --git a/sdk/python/feast/permissions/auth/auth_type.py b/sdk/python/feast/permissions/auth/auth_type.py new file mode 100644 index 0000000000..3fa34f97bd --- /dev/null +++ b/sdk/python/feast/permissions/auth/auth_type.py @@ -0,0 +1,11 @@ +import enum + + +class AuthType(enum.Enum): + """ + Identify the type of authorization. 
+ """ + + NONE = "no_auth" + OIDC = "oidc" + KUBERNETES = "kubernetes" diff --git a/sdk/python/feast/permissions/auth/kubernetes_token_parser.py b/sdk/python/feast/permissions/auth/kubernetes_token_parser.py new file mode 100644 index 0000000000..c16e5232fb --- /dev/null +++ b/sdk/python/feast/permissions/auth/kubernetes_token_parser.py @@ -0,0 +1,107 @@ +import logging + +import jwt +from kubernetes import client, config +from starlette.authentication import ( + AuthenticationError, +) + +from feast.permissions.auth.token_parser import TokenParser +from feast.permissions.user import User + +logger = logging.getLogger(__name__) + + +class KubernetesTokenParser(TokenParser): + """ + A `TokenParser` implementation to use Kubernetes RBAC resources to retrieve the user details. + The assumption is that the request header includes an authorization bearer with the token of the + client `ServiceAccount`. + By inspecting the role bindings, this `TokenParser` extracts the associated `Role`s. + + The client `ServiceAccount` is instead used as the user name, together with the current namespace. + """ + + def __init__(self): + config.load_incluster_config() + self.v1 = client.CoreV1Api() + self.rbac_v1 = client.RbacAuthorizationV1Api() + + async def user_details_from_access_token(self, access_token: str) -> User: + """ + Extract the service account from the token and search the roles associated with it. + + Returns: + User: Current user, with associated roles. The `username` is the `:` separated concatenation of `namespace` and `service account name`. + + Raises: + AuthenticationError if any error happens. + """ + sa_namespace, sa_name = _decode_token(access_token) + current_user = f"{sa_namespace}:{sa_name}" + logging.info(f"Received request from {sa_name} in {sa_namespace}") + + roles = self.get_roles(sa_namespace, sa_name) + logging.info(f"SA roles are: {roles}") + + return User(username=current_user, roles=roles) + + def get_roles(self, namespace: str, service_account_name: str) -> list[str]: + """ + Fetches the Kubernetes `Role`s associated to the given `ServiceAccount` in the given `namespace`. + + The research also includes the `ClusterRole`s, so the running deployment must be granted enough permissions to query + for such instances in all the namespaces. + + Returns: + list[str]: Name of the `Role`s and `ClusterRole`s associated to the service account. No string manipulation is performed on the role name. + """ + role_bindings = self.rbac_v1.list_namespaced_role_binding(namespace) + cluster_role_bindings = self.rbac_v1.list_cluster_role_binding() + + roles: set[str] = set() + + for binding in role_bindings.items: + if binding.subjects is not None: + for subject in binding.subjects: + if ( + subject.kind == "ServiceAccount" + and subject.name == service_account_name + ): + roles.add(binding.role_ref.name) + + for binding in cluster_role_bindings.items: + if binding.subjects is not None: + for subject in binding.subjects: + if ( + subject.kind == "ServiceAccount" + and subject.name == service_account_name + and subject.namespace == namespace + ): + roles.add(binding.role_ref.name) + + return list(roles) + + +def _decode_token(access_token: str) -> tuple[str, str]: + """ + The `sub` portion of the decoded token includes the service account name in the format: `system:serviceaccount:NAMESPACE:SA_NAME` + + Returns: + str: the namespace name. + str: the `ServiceAccount` name. 
+ """ + try: + decoded_token = jwt.decode(access_token, options={"verify_signature": False}) + if "sub" in decoded_token: + subject: str = decoded_token["sub"] + if len(subject.split(":")) != 4: + raise AuthenticationError( + f"Expecting 4 elements separated by : in th subject section, instead of {len(subject.split(':'))}." + ) + _, _, sa_namespace, sa_name = subject.split(":") + return (sa_namespace, sa_name) + else: + raise AuthenticationError("Missing sub section in received token.") + except jwt.DecodeError as e: + raise AuthenticationError(f"Error decoding JWT token: {e}") diff --git a/sdk/python/feast/permissions/auth/oidc_token_parser.py b/sdk/python/feast/permissions/auth/oidc_token_parser.py new file mode 100644 index 0000000000..921a585bc2 --- /dev/null +++ b/sdk/python/feast/permissions/auth/oidc_token_parser.py @@ -0,0 +1,105 @@ +import logging +from unittest.mock import Mock + +import jwt +from fastapi import Request +from fastapi.security import OAuth2AuthorizationCodeBearer +from jwt import PyJWKClient +from starlette.authentication import ( + AuthenticationError, +) + +from feast.permissions.auth.token_parser import TokenParser +from feast.permissions.auth_model import OidcAuthConfig +from feast.permissions.user import User + +logger = logging.getLogger(__name__) +logger.setLevel(logging.INFO) + + +class OidcTokenParser(TokenParser): + """ + A `TokenParser` to use an OIDC server to retrieve the user details. + Server settings are retrieved from the `auth` configurationof the Feature store. + """ + + _auth_config: OidcAuthConfig + + def __init__(self, auth_config: OidcAuthConfig): + self._auth_config = auth_config + + async def _validate_token(self, access_token: str): + """ + Validate the token extracted from the headrer of the user request against the OAuth2 server. + """ + # FastAPI's OAuth2AuthorizationCodeBearer requires a Request type but actually uses only the headers field + # https://github.com/tiangolo/fastapi/blob/eca465f4c96acc5f6a22e92fd2211675ca8a20c8/fastapi/security/oauth2.py#L380 + request = Mock(spec=Request) + request.headers = {"Authorization": f"Bearer {access_token}"} + + oauth_2_scheme = OAuth2AuthorizationCodeBearer( + tokenUrl=f"{self._auth_config.auth_server_url}/realms/{self._auth_config.realm}/protocol/openid-connect/token", + authorizationUrl=f"{self._auth_config.auth_server_url}/realms/{self._auth_config.realm}/protocol/openid-connect/auth", + refreshUrl=f"{self._auth_config.auth_server_url}/realms/{self._auth_config.realm}/protocol/openid-connect/token", + ) + + await oauth_2_scheme(request=request) + + async def user_details_from_access_token(self, access_token: str) -> User: + """ + Validate the access token then decode it to extract the user credential and roles. + + Returns: + User: Current user, with associated roles. + + Raises: + AuthenticationError if any error happens. 
+ """ + + try: + await self._validate_token(access_token) + logger.info("Validated token") + except Exception as e: + raise AuthenticationError(f"Invalid token: {e}") + + url = f"{self._auth_config.auth_server_url}/realms/{self._auth_config.realm}/protocol/openid-connect/certs" + optional_custom_headers = {"User-agent": "custom-user-agent"} + jwks_client = PyJWKClient(url, headers=optional_custom_headers) + + try: + signing_key = jwks_client.get_signing_key_from_jwt(access_token) + data = jwt.decode( + access_token, + signing_key.key, + algorithms=["RS256"], + audience="account", + options={ + "verify_aud": False, + "verify_signature": True, + "verify_exp": True, + }, + leeway=10, # accepts tokens generated up to 10 seconds in the past, in case of clock skew + ) + + if "preferred_username" not in data: + raise AuthenticationError( + "Missing preferred_username field in access token." + ) + current_user = data["preferred_username"] + + if "resource_access" not in data: + logger.warning("Missing resource_access field in access token.") + client_id = self._auth_config.client_id + if client_id not in data["resource_access"]: + logger.warning( + f"Missing resource_access.{client_id} field in access token. Defaulting to empty roles." + ) + roles = [] + else: + roles = data["resource_access"][client_id]["roles"] + + logger.info(f"Extracted user {current_user} and roles {roles}") + return User(username=current_user, roles=roles) + except jwt.exceptions.InvalidTokenError: + logger.exception("Exception while parsing the token:") + raise AuthenticationError("Invalid token.") diff --git a/sdk/python/feast/permissions/auth/token_extractor.py b/sdk/python/feast/permissions/auth/token_extractor.py new file mode 100644 index 0000000000..37779d7640 --- /dev/null +++ b/sdk/python/feast/permissions/auth/token_extractor.py @@ -0,0 +1,51 @@ +import re +from abc import ABC + +from starlette.authentication import ( + AuthenticationError, +) + + +class TokenExtractor(ABC): + """ + A class to extract the authorization token from a user request. + """ + + def extract_access_token(self, **kwargs) -> str: + """ + Extract the authorization token from a user request. + + The actual implementation has to specify what arguments have to be defined in the kwywork args `kwargs` + + Returns: + The extracted access token. + """ + raise NotImplementedError() + + def _extract_bearer_token(self, auth_header: str) -> str: + """ + Extract the bearer token from the authorization header value. + + Args: + auth_header: The full value of the authorization header. + + Returns: + str: The token value, without the `Bearer` part. + + Raises: + AuthenticationError if the authorization token does not match the `Bearer` scheme. 
+ """ + pattern = r"(?i)Bearer .+" + if not bool(re.match(pattern, auth_header)): + raise AuthenticationError(f"Expected Bearer schema, found {auth_header}") + _, access_token = auth_header.split() + return access_token + + +class NoAuthTokenExtractor(TokenExtractor): + """ + A `TokenExtractor` always returning an empty token + """ + + def extract_access_token(self, **kwargs) -> str: + return "" diff --git a/sdk/python/feast/permissions/auth/token_parser.py b/sdk/python/feast/permissions/auth/token_parser.py new file mode 100644 index 0000000000..f8f2aee44a --- /dev/null +++ b/sdk/python/feast/permissions/auth/token_parser.py @@ -0,0 +1,28 @@ +from abc import ABC, abstractmethod + +from feast.permissions.user import User + + +class TokenParser(ABC): + """ + A class to parse an access token to extract the user credential and roles. + """ + + @abstractmethod + async def user_details_from_access_token(self, access_token: str) -> User: + """ + Parse the access token and return the current user and the list of associated roles. + + Returns: + User: Current user, with associated roles. + """ + raise NotImplementedError() + + +class NoAuthTokenParser(TokenParser): + """ + A `TokenParser` always returning an empty token + """ + + async def user_details_from_access_token(self, access_token: str, **kwargs) -> User: + return User(username="", roles=[]) diff --git a/sdk/python/feast/permissions/auth_model.py b/sdk/python/feast/permissions/auth_model.py new file mode 100644 index 0000000000..afb0a22bc9 --- /dev/null +++ b/sdk/python/feast/permissions/auth_model.py @@ -0,0 +1,25 @@ +from typing import Literal, Optional + +from feast.repo_config import FeastConfigBaseModel + + +class AuthConfig(FeastConfigBaseModel): + type: Literal["oidc", "kubernetes", "no_auth"] = "no_auth" + + +class OidcAuthConfig(AuthConfig): + auth_server_url: Optional[str] = None + auth_discovery_url: str + client_id: str + client_secret: Optional[str] = None + username: str + password: str + realm: str = "master" + + +class NoAuthConfig(AuthConfig): + pass + + +class KubernetesAuthConfig(AuthConfig): + pass diff --git a/sdk/python/feast/permissions/client/__init__.py b/sdk/python/feast/permissions/client/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/sdk/python/feast/permissions/client/arrow_flight_auth_interceptor.py b/sdk/python/feast/permissions/client/arrow_flight_auth_interceptor.py new file mode 100644 index 0000000000..724c7df5ca --- /dev/null +++ b/sdk/python/feast/permissions/client/arrow_flight_auth_interceptor.py @@ -0,0 +1,38 @@ +import pyarrow.flight as fl + +from feast.permissions.auth.auth_type import AuthType +from feast.permissions.auth_model import AuthConfig +from feast.permissions.client.auth_client_manager_factory import get_auth_token + + +class FlightBearerTokenInterceptor(fl.ClientMiddleware): + def __init__(self, auth_config: AuthConfig): + super().__init__() + self.auth_config = auth_config + + def call_completed(self, exception): + pass + + def received_headers(self, headers): + pass + + def sending_headers(self): + access_token = get_auth_token(self.auth_config) + return {b"authorization": b"Bearer " + access_token.encode("utf-8")} + + +class FlightAuthInterceptorFactory(fl.ClientMiddlewareFactory): + def __init__(self, auth_config: AuthConfig): + super().__init__() + self.auth_config = auth_config + + def start_call(self, info): + return FlightBearerTokenInterceptor(self.auth_config) + + +def build_arrow_flight_client(host: str, port, auth_config: AuthConfig): + if 
auth_config.type != AuthType.NONE.value: + middleware_factory = FlightAuthInterceptorFactory(auth_config) + return fl.FlightClient(f"grpc://{host}:{port}", middleware=[middleware_factory]) + else: + return fl.FlightClient(f"grpc://{host}:{port}") diff --git a/sdk/python/feast/permissions/client/auth_client_manager.py b/sdk/python/feast/permissions/client/auth_client_manager.py new file mode 100644 index 0000000000..82f9b7433e --- /dev/null +++ b/sdk/python/feast/permissions/client/auth_client_manager.py @@ -0,0 +1,8 @@ +from abc import ABC, abstractmethod + + +class AuthenticationClientManager(ABC): + @abstractmethod + def get_token(self) -> str: + """Retrieves the token based on the authentication type configuration""" + pass diff --git a/sdk/python/feast/permissions/client/auth_client_manager_factory.py b/sdk/python/feast/permissions/client/auth_client_manager_factory.py new file mode 100644 index 0000000000..4e49802047 --- /dev/null +++ b/sdk/python/feast/permissions/client/auth_client_manager_factory.py @@ -0,0 +1,30 @@ +from feast.permissions.auth.auth_type import AuthType +from feast.permissions.auth_model import ( + AuthConfig, + KubernetesAuthConfig, + OidcAuthConfig, +) +from feast.permissions.client.auth_client_manager import AuthenticationClientManager +from feast.permissions.client.kubernetes_auth_client_manager import ( + KubernetesAuthClientManager, +) +from feast.permissions.client.oidc_authentication_client_manager import ( + OidcAuthClientManager, +) + + +def get_auth_client_manager(auth_config: AuthConfig) -> AuthenticationClientManager: + if auth_config.type == AuthType.OIDC.value: + assert isinstance(auth_config, OidcAuthConfig) + return OidcAuthClientManager(auth_config) + elif auth_config.type == AuthType.KUBERNETES.value: + assert isinstance(auth_config, KubernetesAuthConfig) + return KubernetesAuthClientManager(auth_config) + else: + raise RuntimeError( + f"No Auth client manager implemented for the auth type:${auth_config.type}" + ) + + +def get_auth_token(auth_config: AuthConfig) -> str: + return get_auth_client_manager(auth_config).get_token() diff --git a/sdk/python/feast/permissions/client/grpc_client_auth_interceptor.py b/sdk/python/feast/permissions/client/grpc_client_auth_interceptor.py new file mode 100644 index 0000000000..98cc445c7b --- /dev/null +++ b/sdk/python/feast/permissions/client/grpc_client_auth_interceptor.py @@ -0,0 +1,52 @@ +import logging + +import grpc + +from feast.permissions.auth_model import AuthConfig +from feast.permissions.client.auth_client_manager_factory import get_auth_token + +logger = logging.getLogger(__name__) + + +class GrpcClientAuthHeaderInterceptor( + grpc.UnaryUnaryClientInterceptor, + grpc.UnaryStreamClientInterceptor, + grpc.StreamUnaryClientInterceptor, + grpc.StreamStreamClientInterceptor, +): + def __init__(self, auth_type: AuthConfig): + self._auth_type = auth_type + + def intercept_unary_unary( + self, continuation, client_call_details, request_iterator + ): + client_call_details = self._append_auth_header_metadata(client_call_details) + return continuation(client_call_details, request_iterator) + + def intercept_unary_stream( + self, continuation, client_call_details, request_iterator + ): + client_call_details = self._append_auth_header_metadata(client_call_details) + return continuation(client_call_details, request_iterator) + + def intercept_stream_unary( + self, continuation, client_call_details, request_iterator + ): + client_call_details = self._append_auth_header_metadata(client_call_details) + return 
continuation(client_call_details, request_iterator) + + def intercept_stream_stream( + self, continuation, client_call_details, request_iterator + ): + client_call_details = self._append_auth_header_metadata(client_call_details) + return continuation(client_call_details, request_iterator) + + def _append_auth_header_metadata(self, client_call_details): + logger.debug( + "Intercepted the grpc api method call to inject Authorization header " + ) + metadata = client_call_details.metadata or [] + access_token = get_auth_token(self._auth_type) + metadata.append((b"authorization", b"Bearer " + access_token.encode("utf-8"))) + client_call_details = client_call_details._replace(metadata=metadata) + return client_call_details diff --git a/sdk/python/feast/permissions/client/http_auth_requests_wrapper.py b/sdk/python/feast/permissions/client/http_auth_requests_wrapper.py new file mode 100644 index 0000000000..3232e25025 --- /dev/null +++ b/sdk/python/feast/permissions/client/http_auth_requests_wrapper.py @@ -0,0 +1,22 @@ +import requests +from requests import Session + +from feast.permissions.auth.auth_type import AuthType +from feast.permissions.auth_model import ( + AuthConfig, +) +from feast.permissions.client.auth_client_manager_factory import get_auth_token + + +class AuthenticatedRequestsSession(Session): + def __init__(self, auth_token: str): + super().__init__() + self.headers.update({"Authorization": f"Bearer {auth_token}"}) + + +def get_http_auth_requests_session(auth_config: AuthConfig) -> Session: + if auth_config.type == AuthType.NONE.value: + request_session = requests.session() + else: + request_session = AuthenticatedRequestsSession(get_auth_token(auth_config)) + return request_session diff --git a/sdk/python/feast/permissions/client/kubernetes_auth_client_manager.py b/sdk/python/feast/permissions/client/kubernetes_auth_client_manager.py new file mode 100644 index 0000000000..1ca3c5a2ae --- /dev/null +++ b/sdk/python/feast/permissions/client/kubernetes_auth_client_manager.py @@ -0,0 +1,43 @@ +import logging +import os + +from feast.permissions.auth_model import KubernetesAuthConfig +from feast.permissions.client.auth_client_manager import AuthenticationClientManager + +logger = logging.getLogger(__name__) + + +class KubernetesAuthClientManager(AuthenticationClientManager): + def __init__(self, auth_config: KubernetesAuthConfig): + self.auth_config = auth_config + self.token_file_path = "/var/run/secrets/kubernetes.io/serviceaccount/token" + + def get_token(self): + try: + token = self._read_token_from_file() + return token + except Exception as e: + logger.info(f"Error reading token from file: {e}") + logger.info("Attempting to read token from environment variable.") + try: + token = self._read_token_from_env() + return token + except Exception as env_e: + logger.exception( + f"Error reading token from environment variable: {env_e}" + ) + raise env_e + + def _read_token_from_file(self): + try: + with open(self.token_file_path, "r") as file: + token = file.read().strip() + return token + except Exception as e: + raise e + + def _read_token_from_env(self): + token = os.getenv("LOCAL_K8S_TOKEN") + if not token: + raise KeyError("LOCAL_K8S_TOKEN environment variable is not set.") + return token diff --git a/sdk/python/feast/permissions/client/oidc_authentication_client_manager.py b/sdk/python/feast/permissions/client/oidc_authentication_client_manager.py new file mode 100644 index 0000000000..544764aae0 --- /dev/null +++ 
b/sdk/python/feast/permissions/client/oidc_authentication_client_manager.py @@ -0,0 +1,58 @@ +import logging + +import requests + +from feast.permissions.auth_model import OidcAuthConfig +from feast.permissions.client.auth_client_manager import AuthenticationClientManager + +logger = logging.getLogger(__name__) + + +class OidcAuthClientManager(AuthenticationClientManager): + def __init__(self, auth_config: OidcAuthConfig): + self.auth_config = auth_config + + def _get_token_endpoint(self): + response = requests.get(self.auth_config.auth_discovery_url) + if response.status_code == 200: + oidc_config = response.json() + if not oidc_config["token_endpoint"]: + raise RuntimeError( + " OIDC token_endpoint is not available from discovery url response." + ) + return oidc_config["token_endpoint"].replace( + "master", self.auth_config.realm + ) + else: + raise RuntimeError( + f"Error fetching OIDC token endpoint configuration: {response.status_code} - {response.text}" + ) + + def get_token(self): + # Fetch the token endpoint from the discovery URL + token_endpoint = self._get_token_endpoint() + + token_request_body = { + "grant_type": "password", + "client_id": self.auth_config.client_id, + "client_secret": self.auth_config.client_secret, + "username": self.auth_config.username, + "password": self.auth_config.password, + } + headers = {"Content-Type": "application/x-www-form-urlencoded"} + + token_response = requests.post( + token_endpoint, data=token_request_body, headers=headers + ) + if token_response.status_code == 200: + access_token = token_response.json()["access_token"] + if not access_token: + logger.debug( + f"access_token is empty for the client_id=${self.auth_config.client_id}" + ) + raise RuntimeError("access token is empty") + return access_token + else: + raise RuntimeError( + f"""Failed to obtain oidc access token:url=[{token_endpoint}] {token_response.status_code} - {token_response.text}""" + ) diff --git a/sdk/python/feast/permissions/decision.py b/sdk/python/feast/permissions/decision.py new file mode 100644 index 0000000000..963befe831 --- /dev/null +++ b/sdk/python/feast/permissions/decision.py @@ -0,0 +1,114 @@ +import enum +import logging +from typing import Optional + +logger = logging.getLogger(__name__) + + +class DecisionStrategy(enum.Enum): + """ + The strategy to be adopted in case multiple permissions match an execution request. + """ + + UNANIMOUS = "unanimous" # All policies must evaluate to a positive decision for the final decision to be also positive. + AFFIRMATIVE = ( + "affirmative" # At least one policy must evaluate to a positive decision + ) + # The number of positive decisions must be greater than the number of negative decisions. + # If the number of positive and negative decisions is the same, the final decision will be negative. + CONSENSUS = "consensus" + + +class DecisionEvaluator: + """ + A class to implement the decision logic, according to the selected strategy. + + Args: + decision_strategy: The associated `DecisionStrategy`. + num_of_voters: The expected number of votes to complete the decision. 
+ + Examples: + Create the instance and specify the number of voters: + `evaluator = DecisionEvaluator(3)` + + For each vote that you receive, add a decision grant: `evaluator.add_grant(vote, message)` + and check if the decision process ended: `if evaluator.is_decided():` + Once decided, get the result and the failure explanations using: + `grant, explanations = evaluator.grant()` + """ + + def __init__( + self, + num_of_voters: int, + ): + # Only the AFFIRMATIVE strategy is currently available + decision_strategy = DecisionStrategy.AFFIRMATIVE + self.num_of_voters = num_of_voters + + self.grant_count = 0 + self.deny_count = 0 + + self.grant_quorum = ( + 1 + if decision_strategy == DecisionStrategy.AFFIRMATIVE + else num_of_voters + if decision_strategy == DecisionStrategy.UNANIMOUS + else num_of_voters // 2 + 1 + ) + self.deny_quorum = ( + num_of_voters + if decision_strategy == DecisionStrategy.AFFIRMATIVE + else 1 + if decision_strategy == DecisionStrategy.UNANIMOUS + else num_of_voters // 2 + (num_of_voters % 2) + ) + self.grant_decision: Optional[bool] = None + self.explanations: list[str] = [] + logger.info( + f"Decision evaluation started with grant_quorum={self.grant_quorum}, deny_quorum={self.deny_quorum}" + ) + + def is_decided(self) -> bool: + """ + Returns: + bool: `True` when the decision process has completed (e.g. we added as many votes as specified in the `num_of_voters` creation argument). + """ + return self.grant_decision is not None + + def grant(self) -> tuple[bool, list[str]]: + """ + Returns: + tuple[bool, list[str]]: The tuple of decision computation: a `bool` with the computation decision and a `list[str]` with the + denial explanations (possibly empty). + """ + logger.info( + f"Decided grant is {self.grant_decision}, explanations={self.explanations}" + ) + return bool(self.grant_decision), self.explanations + + def add_grant(self, grant: bool, explanation: str): + """ + Add a single vote to the decision computation, with a possible denial reason. + If the evaluation process already ended, additional votes are discarded. + + Args: + grant: `True` if the decision is accepted, `False` otherwise. + explanation: Denial reason (not considered when `grant` is `True`). + """ + + if self.is_decided(): + logger.warning("Grant decision already decided, discarding vote") + return + if grant: + self.grant_count += 1 + else: + self.deny_count += 1 + self.explanations.append(explanation) + + if self.grant_count >= self.grant_quorum: + self.grant_decision = True + if self.deny_count >= self.deny_quorum: + self.grant_decision = False + logger.debug( + f"After new grant: grants={self.grant_count}, deny_count={self.deny_count}, grant_decision={self.grant_decision}" + ) diff --git a/sdk/python/feast/permissions/decorator.py b/sdk/python/feast/permissions/decorator.py new file mode 100644 index 0000000000..3b9f7a4ae3 --- /dev/null +++ b/sdk/python/feast/permissions/decorator.py @@ -0,0 +1,42 @@ +import logging +from typing import Union + +from feast.permissions.action import AuthzedAction +from feast.permissions.matcher import is_a_feast_object +from feast.permissions.security_manager import assert_permissions + +logger = logging.getLogger(__name__) + + +def require_permissions(actions: Union[list[AuthzedAction], AuthzedAction]): + """ + A decorator to define the actions that are executed from the decorated class method and that must be protected + against unauthorized access.
+ + The first parameter of the protected method must be `self`. + Args: + actions: The list of actions that must be permitted to the current user. + """ + + def require_permissions_decorator(func): + def permission_checker(*args, **kwargs): + logger.debug(f"permission_checker for {args}, {kwargs}") + resource = args[0] + if not is_a_feast_object(resource): + raise NotImplementedError( + f"The first argument is not of a managed type but {type(resource)}" + ) + + assert_permissions( + resource=resource, + actions=actions, + ) + logger.debug( + f"Current User can invoke {actions} on {resource.name}:{type(resource)} " + ) + result = func(*args, **kwargs) + return result + + return permission_checker + + return require_permissions_decorator diff --git a/sdk/python/feast/permissions/enforcer.py b/sdk/python/feast/permissions/enforcer.py new file mode 100644 index 0000000000..af41d12a2c --- /dev/null +++ b/sdk/python/feast/permissions/enforcer.py @@ -0,0 +1,77 @@ +import logging + +from feast.feast_object import FeastObject +from feast.permissions.decision import DecisionEvaluator +from feast.permissions.permission import ( + AuthzedAction, + Permission, +) +from feast.permissions.user import User + +logger = logging.getLogger(__name__) + + +def enforce_policy( + permissions: list[Permission], + user: User, + resources: list[FeastObject], + actions: list[AuthzedAction], + filter_only: bool = False, +) -> list[FeastObject]: + """ + Define the logic to apply the configured permissions when a given action is requested on + a protected resource. + + If no permissions are defined, the result is to allow the execution. + + Args: + permissions: The configured set of `Permission`. + user: The current user. + resources: The resources for which we need to enforce authorized permission. + actions: The requested actions to be authorized. + filter_only: If `True`, it removes unauthorized resources from the returned value, otherwise it raises a `PermissionError` for the + first unauthorized resource. Defaults to `False`. + + Returns: + list[FeastObject]: A filtered list of the permitted resources. + + Raises: + PermissionError: If the current user is not authorized to execute the requested actions on the given resources (and `filter_only` is `False`). + """ + if not permissions: + return resources + + _permitted_resources: list[FeastObject] = [] + for resource in resources: + logger.debug( + f"Enforcing permission policies for {type(resource).__name__}:{resource.name} to execute {actions}" + ) + matching_permissions = [ + p + for p in permissions + if p.match_resource(resource) and p.match_actions(actions) + ] + + if matching_permissions: + evaluator = DecisionEvaluator(len(matching_permissions)) + for p in matching_permissions: + permission_grant, permission_explanation = p.policy.validate_user( + user=user + ) + evaluator.add_grant( + permission_grant, + f"Permission {p.name} denied execution of {[a.value.upper() for a in actions]} to {type(resource).__name__}:{resource.name}: {permission_explanation}", + ) + + if evaluator.is_decided(): + grant, explanations = evaluator.grant() + if not grant and not filter_only: + raise PermissionError(",".join(explanations)) + if grant: + _permitted_resources.append(resource) + break + else: + message = f"No permissions defined to manage {actions} on {type(resource)}/{resource.name}."
+ logger.exception(f"**PERMISSION NOT GRANTED**: {message}") + raise PermissionError(message) + return _permitted_resources diff --git a/sdk/python/feast/permissions/matcher.py b/sdk/python/feast/permissions/matcher.py new file mode 100644 index 0000000000..337bfd5c57 --- /dev/null +++ b/sdk/python/feast/permissions/matcher.py @@ -0,0 +1,129 @@ +""" +This module provides utility matching functions. +""" + +import logging +import re +from typing import TYPE_CHECKING, Any, Optional +from unittest.mock import Mock + +from feast.permissions.action import AuthzedAction + +if TYPE_CHECKING: + from feast.feast_object import FeastObject + +logger = logging.getLogger(__name__) + + +def is_a_feast_object(resource: Any): + """ + A matcher to verify that a given object is one of the Feast objects defined in the `FeastObject` type. + + Args: + resource: An object instance to verify. + Returns: + `True` if the given object is one of the types in the FeastObject alias or a subclass of one of them. + """ + from feast.feast_object import ALL_RESOURCE_TYPES + + for t in ALL_RESOURCE_TYPES: + # Use isinstance to pass Mock validation + if isinstance(resource, t): + return True + return False + + +def _get_type(resource: "FeastObject") -> Any: + is_mock = isinstance(resource, Mock) + if not is_mock: + return type(resource) + else: + return getattr(resource, "_spec_class", None) + + +def resource_match_config( + resource: "FeastObject", + expected_types: list["FeastObject"], + name_pattern: Optional[str] = None, + required_tags: Optional[dict[str, str]] = None, +) -> bool: + """ + Match a given Feast object against the configured type, name and tags in a permission configuration. + + Args: + resource: A FeastObject instance to match against the permission. + expected_types: The list of object types configured in the permission. Type match also includes all the sub-classes. + name_pattern: The optional name pattern filter configured in the permission. + required_tags: The optional dictionary of required tags configured in the permission. + + Returns: + bool: `True` if the resource matches the configured permission filters. + """ + if resource is None: + logger.warning(f"None passed to {resource_match_config.__name__}") + return False + + _type = _get_type(resource) + if not is_a_feast_object(resource): + logger.warning(f"Given resource is not of a managed type but {_type}") + return False + + # mypy check ignored because of https://github.com/python/mypy/issues/11673, or it raises "Argument 2 to "isinstance" has incompatible type "tuple[Featu ..."
+ if not isinstance(resource, tuple(expected_types)): # type: ignore + logger.info( + f"Resource does not match any of the expected type {expected_types}" + ) + return False + + if name_pattern is not None: + if hasattr(resource, "name"): + if isinstance(resource.name, str): + match = bool(re.fullmatch(name_pattern, resource.name)) + if not match: + logger.info( + f"Resource name {resource.name} does not match pattern {name_pattern}" + ) + return False + else: + logger.warning( + f"Resource {resource} has no `name` attribute of unexpected type {type(resource.name)}" + ) + else: + logger.warning(f"Resource {resource} has no `name` attribute") + + if required_tags: + if hasattr(resource, "required_tags"): + if isinstance(resource.required_tags, dict): + for tag in required_tags.keys(): + required_value = required_tags.get(tag) + actual_value = resource.required_tags.get(tag) + if required_value != actual_value: + logger.info( + f"Unmatched value {actual_value} for required tag {tag}: expected {required_value}" + ) + return False + else: + logger.warning( + f"Resource {resource} has no `required_tags` attribute of unexpected type {type(resource.required_tags)}" + ) + else: + logger.warning(f"Resource {resource} has no `required_tags` attribute") + + return True + + +def actions_match_config( + requested_actions: list[AuthzedAction], + allowed_actions: list[AuthzedAction], +) -> bool: + """ + Match a list of actions against the actions defined in a permission configuration. + + Args: + requested_actions: A list of actions to be executed. + allowed_actions: The list of actions configured in the permission. + + Returns: + bool: `True` if all the given `requested_actions` are defined in the `allowed_actions`. + """ + return all(a in allowed_actions for a in requested_actions) diff --git a/sdk/python/feast/permissions/permission.py b/sdk/python/feast/permissions/permission.py new file mode 100644 index 0000000000..1117a3ee82 --- /dev/null +++ b/sdk/python/feast/permissions/permission.py @@ -0,0 +1,269 @@ +import logging +import re +from abc import ABC +from datetime import datetime +from typing import TYPE_CHECKING, Any, Dict, Optional, Union + +from google.protobuf.json_format import MessageToJson + +from feast.importer import import_class +from feast.permissions.action import ALL_ACTIONS, AuthzedAction +from feast.permissions.matcher import actions_match_config, resource_match_config +from feast.permissions.policy import AllowAll, Policy +from feast.protos.feast.core.Permission_pb2 import Permission as PermissionProto +from feast.protos.feast.core.Permission_pb2 import PermissionMeta as PermissionMetaProto +from feast.protos.feast.core.Permission_pb2 import PermissionSpec as PermissionSpecProto + +if TYPE_CHECKING: + from feast.feast_object import FeastObject + +logger = logging.getLogger(__name__) + +""" +Constant to refer to all the managed types. +""" + + +class Permission(ABC): + """ + The Permission class defines the authorization policy to be validated whenever the identified actions are + requested on the matching resources. + + Attributes: + name: The permission name (can be duplicated, used for logging troubleshooting). + types: The list of protected resource types as defined by the `FeastObject` type. The match includes all the sub-classes of the given types. + Defaults to all managed types (e.g. the `ALL_RESOURCE_TYPES` constant) + name_pattern: A regex to match the resource name. 
Defaults to None, meaning that no name filtering is applied. + actions: The actions authorized by this permission. Defaults to `ALL_ACTIONS`. + policy: The policy to be applied to validate a client request. + tags: A dictionary of key-value pairs to store arbitrary metadata. + required_tags: Dictionary of key-value pairs that must match the resource tags. All these tags must + be present in the resource tags with the given value. Defaults to None, meaning that no tags filtering is applied. + """ + + _name: str + _types: list["FeastObject"] + _name_pattern: Optional[str] + _actions: list[AuthzedAction] + _policy: Policy + _tags: Dict[str, str] + _required_tags: dict[str, str] + created_timestamp: Optional[datetime] + last_updated_timestamp: Optional[datetime] + + def __init__( + self, + name: str, + types: Optional[Union[list["FeastObject"], "FeastObject"]] = None, + name_pattern: Optional[str] = None, + actions: Union[list[AuthzedAction], AuthzedAction] = ALL_ACTIONS, + policy: Policy = AllowAll, + tags: Optional[dict[str, str]] = None, + required_tags: Optional[dict[str, str]] = None, + ): + from feast.feast_object import ALL_RESOURCE_TYPES + + if not types: + types = ALL_RESOURCE_TYPES + for t in types if isinstance(types, list) else [types]: + if t not in ALL_RESOURCE_TYPES: + raise ValueError(f"{t} is not one of the managed types") + if actions is None or not actions: + raise ValueError("The list 'actions' must be non-empty.") + if not policy: + raise ValueError("The list 'policy' must be non-empty.") + self._name = name + self._types = types if isinstance(types, list) else [types] + self._name_pattern = _normalize_name_pattern(name_pattern) + self._actions = actions if isinstance(actions, list) else [actions] + self._policy = policy + self._tags = _normalize_tags(tags) + self._required_tags = _normalize_tags(required_tags) + self.created_timestamp = None + self.last_updated_timestamp = None + + def __eq__(self, other): + if not isinstance(other, Permission): + raise TypeError("Comparisons should only involve Permission class objects.") + + if ( + self.name != other.name + or self.name_pattern != other.name_pattern + or self.tags != other.tags + or self.policy != other.policy + or self.actions != other.actions + or self.required_tags != other.required_tags + ): + return False + + if set(self.types) != set(other.types): + return False + + return True + + def __hash__(self): + return hash(self.name) + + def __str__(self): + return str(MessageToJson(self.to_proto())) + + @property + def name(self) -> str: + return self._name + + @property + def types(self) -> list["FeastObject"]: + return self._types + + @property + def name_pattern(self) -> Optional[str]: + return self._name_pattern + + @property + def actions(self) -> list[AuthzedAction]: + return self._actions + + @property + def policy(self) -> Policy: + return self._policy + + @property + def tags(self) -> Dict[str, str]: + return self._tags + + @property + def required_tags(self) -> Dict[str, str]: + return self._required_tags + + def match_resource(self, resource: "FeastObject") -> bool: + """ + Returns: + `True` when the given resource matches the type, name and tags filters defined in the permission. + """ + return resource_match_config( + resource=resource, + expected_types=self.types, + name_pattern=self.name_pattern, + required_tags=self.required_tags, + ) + + def match_actions(self, requested_actions: list[AuthzedAction]) -> bool: + """ + Returns: + `True` when the given actions are included in the permitted actions.
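To make the permission model above concrete, a hypothetical permission built with this API (the resource name pattern and role are illustrative only, not taken from this diff):

    from feast.feature_view import FeatureView
    from feast.permissions.action import READ
    from feast.permissions.permission import Permission
    from feast.permissions.policy import RoleBasedPolicy

    # Holders of the "data-scientist" role may read feature views whose name starts with "driver_".
    driver_reader = Permission(
        name="driver-reader",
        types=[FeatureView],
        name_pattern="driver_.*",
        actions=READ,
        policy=RoleBasedPolicy(roles=["data-scientist"]),
    )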
+ """ + return actions_match_config( + allowed_actions=self.actions, + requested_actions=requested_actions, + ) + + @staticmethod + def from_proto(permission_proto: PermissionProto) -> Any: + """ + Converts permission config in protobuf spec to a Permission class object. + + Args: + permission_proto: A protobuf representation of a Permission. + + Returns: + A Permission class object. + """ + + types = [ + get_type_class_from_permission_type( + _PERMISSION_TYPES[PermissionSpecProto.Type.Name(t)] + ) + for t in permission_proto.spec.types + ] + actions = [ + AuthzedAction[PermissionSpecProto.AuthzedAction.Name(action)] + for action in permission_proto.spec.actions + ] + + permission = Permission( + permission_proto.spec.name, + types, + permission_proto.spec.name_pattern or None, + actions, + Policy.from_proto(permission_proto.spec.policy), + dict(permission_proto.spec.tags) or None, + dict(permission_proto.spec.required_tags) or None, + ) + + if permission_proto.meta.HasField("created_timestamp"): + permission.created_timestamp = ( + permission_proto.meta.created_timestamp.ToDatetime() + ) + if permission_proto.meta.HasField("last_updated_timestamp"): + permission.last_updated_timestamp = ( + permission_proto.meta.last_updated_timestamp.ToDatetime() + ) + + return permission + + def to_proto(self) -> PermissionProto: + """ + Converts a PermissionProto object to its protobuf representation. + """ + types = [ + PermissionSpecProto.Type.Value( + re.sub(r"([a-z])([A-Z])", r"\1_\2", t.__name__).upper() # type: ignore[union-attr] + ) + for t in self.types + ] + + actions = [ + PermissionSpecProto.AuthzedAction.Value(action.name) + for action in self.actions + ] + + permission_spec = PermissionSpecProto( + name=self.name, + types=types, + name_pattern=self.name_pattern if self.name_pattern is not None else "", + actions=actions, + policy=self.policy.to_proto(), + tags=self.tags, + required_tags=self.required_tags, + ) + + meta = PermissionMetaProto() + if self.created_timestamp: + meta.created_timestamp.FromDatetime(self.created_timestamp) + if self.last_updated_timestamp: + meta.last_updated_timestamp.FromDatetime(self.last_updated_timestamp) + + return PermissionProto(spec=permission_spec, meta=meta) + + +def _normalize_name_pattern(name_pattern: Optional[str]): + if name_pattern is not None: + return name_pattern.strip() + return None + + +def _normalize_tags(tags: Optional[dict[str, str]]): + if tags: + return { + k.strip(): v.strip() if isinstance(v, str) else v for k, v in tags.items() + } + return None + + +def get_type_class_from_permission_type(permission_type: str): + module_name, config_class_name = permission_type.rsplit(".", 1) + return import_class(module_name, config_class_name) + + +_PERMISSION_TYPES = { + "FEATURE_VIEW": "feast.feature_view.FeatureView", + "ON_DEMAND_FEATURE_VIEW": "feast.on_demand_feature_view.OnDemandFeatureView", + "BATCH_FEATURE_VIEW": "feast.batch_feature_view.BatchFeatureView", + "STREAM_FEATURE_VIEW": "feast.stream_feature_view.StreamFeatureView", + "ENTITY": "feast.entity.Entity", + "FEATURE_SERVICE": "feast.feature_service.FeatureService", + "DATA_SOURCE": "feast.data_source.DataSource", + "VALIDATION_REFERENCE": "feast.saved_dataset.ValidationReference", + "SAVED_DATASET": "feast.saved_dataset.SavedDataset", + "PERMISSION": "feast.permissions.permission.Permission", +} diff --git a/sdk/python/feast/permissions/policy.py b/sdk/python/feast/permissions/policy.py new file mode 100644 index 0000000000..271448422f --- /dev/null +++ 
b/sdk/python/feast/permissions/policy.py @@ -0,0 +1,129 @@ +from abc import ABC, abstractmethod +from typing import Any + +from feast.permissions.user import User +from feast.protos.feast.core.Policy_pb2 import Policy as PolicyProto +from feast.protos.feast.core.Policy_pb2 import RoleBasedPolicy as RoleBasedPolicyProto + + +class Policy(ABC): + """ + An abstract class to ensure that the current user matches the configured security policies. + """ + + @abstractmethod + def validate_user(self, user: User) -> tuple[bool, str]: + """ + Validate the given user against the configured policy. + + Args: + user: The current user. + + Returns: + bool: `True` if the user matches the policy criteria, `False` otherwise. + str: A possibly empty explanation of the reason for not matching the configured policy. + """ + raise NotImplementedError + + @staticmethod + def from_proto(policy_proto: PolicyProto) -> Any: + """ + Converts policy config in protobuf spec to a Policy class object. + + Args: + policy_proto: A protobuf representation of a Policy. + + Returns: + A Policy class object. + """ + policy_type = policy_proto.WhichOneof("policy_type") + if policy_type == "role_based_policy": + return RoleBasedPolicy.from_proto(policy_proto) + if policy_type is None: + return None + raise NotImplementedError(f"policy_type is unsupported: {policy_type}") + + @abstractmethod + def to_proto(self) -> PolicyProto: + """ + Converts a PolicyProto object to its protobuf representation. + """ + raise NotImplementedError + + +class RoleBasedPolicy(Policy): + """ + A `Policy` implementation where the user roles must be enforced to grant access to the requested action. + At least one of the configured roles must be granted to the current user in order to allow the execution of the secured operation. + + E.g., if the policy enforces roles `a` and `b`, the user must have at least one of them in order to satisfy the policy. + """ + + def __init__( + self, + roles: list[str], + ): + self.roles = roles + + def __eq__(self, other): + if not isinstance(other, RoleBasedPolicy): + raise TypeError( + "Comparisons should only involve RoleBasedPolicy class objects." + ) + + if sorted(self.roles) != sorted(other.roles): + return False + + return True + + def get_roles(self) -> list[str]: + return self.roles + + def validate_user(self, user: User) -> tuple[bool, str]: + """ + Validate the given `user` against the configured roles. + """ + result = user.has_matching_role(self.roles) + explain = "" if result else f"Requires roles {self.roles}" + return (result, explain) + + @staticmethod + def from_proto(policy_proto: PolicyProto) -> Any: + """ + Converts policy config in protobuf spec to a Policy class object. + + Args: + policy_proto: A protobuf representation of a Policy. + + Returns: + A RoleBasedPolicy class object. + """ + return RoleBasedPolicy(roles=list(policy_proto.role_based_policy.roles)) + + def to_proto(self) -> PolicyProto: + """ + Converts a PolicyProto object to its protobuf representation. 
+ """ + + role_based_policy_proto = RoleBasedPolicyProto(roles=self.roles) + policy_proto = PolicyProto(role_based_policy=role_based_policy_proto) + + return policy_proto + + +def allow_all(self, user: User) -> tuple[bool, str]: + return True, "" + + +def empty_policy(self) -> PolicyProto: + return PolicyProto() + + +""" +A `Policy` instance to allow execution of any action to each user +""" +AllowAll = type( + "AllowAll", + (Policy,), + {Policy.validate_user.__name__: allow_all, Policy.to_proto.__name__: empty_policy}, +)() diff --git a/sdk/python/feast/permissions/security_manager.py b/sdk/python/feast/permissions/security_manager.py new file mode 100644 index 0000000000..178db6e6e9 --- /dev/null +++ b/sdk/python/feast/permissions/security_manager.py @@ -0,0 +1,163 @@ +import logging +from contextvars import ContextVar +from typing import List, Optional, Union + +from feast.feast_object import FeastObject +from feast.infra.registry.base_registry import BaseRegistry +from feast.permissions.action import AuthzedAction +from feast.permissions.enforcer import enforce_policy +from feast.permissions.permission import Permission +from feast.permissions.user import User + +logger = logging.getLogger(__name__) + + +class SecurityManager: + """ + The security manager it's the entry point to validate the configuration of the current user against the configured permission policies. + It is accessed and defined using the global functions `get_security_manager` and `set_security_manager` + """ + + def __init__( + self, + project: str, + registry: BaseRegistry, + ): + self._project = project + self._registry = registry + self._current_user: ContextVar[Optional[User]] = ContextVar( + "current_user", default=None + ) + + def set_current_user(self, current_user: User): + """ + Init the user for the current context. + """ + self._current_user.set(current_user) + + @property + def current_user(self) -> Optional[User]: + """ + Returns: + str: the possibly empty instance of the current user. `contextvars` module is used to ensure that each concurrent request has its own + individual user. + """ + return self._current_user.get() + + @property + def permissions(self) -> list[Permission]: + """ + Returns: + list[Permission]: the list of `Permission` configured in the Feast registry. + """ + return self._registry.list_permissions(project=self._project) + + def assert_permissions( + self, + resources: list[FeastObject], + actions: Union[AuthzedAction, List[AuthzedAction]], + filter_only: bool = False, + ) -> list[FeastObject]: + """ + Verify if the current user is authorized ro execute the requested actions on the given resources. + + If no permissions are defined, the result is to allow the execution. + + Args: + resources: The resources for which we need to enforce authorized permission. + actions: The requested actions to be authorized. + filter_only: If `True`, it removes unauthorized resources from the returned value, otherwise it raises a `PermissionError` the + first unauthorized resource. Defaults to `False`. + + Returns: + list[FeastObject]: A filtered list of the permitted resources, possibly empty. + + Raises: + PermissionError: If the current user is not authorized to eecute all the requested actions on the given resources. 
+ """ + return enforce_policy( + permissions=self.permissions, + user=self.current_user if self.current_user is not None else User("", []), + resources=resources, + actions=actions if isinstance(actions, list) else [actions], + filter_only=filter_only, + ) + + +def assert_permissions( + resource: FeastObject, + actions: Union[AuthzedAction, List[AuthzedAction]], +) -> FeastObject: + """ + A utility function to invoke the `assert_permissions` method on the global security manager. + + If no global `SecurityManager` is defined, the execution is permitted. + + Args: + resource: The resource for which we need to enforce authorized permission. + actions: The requested actions to be authorized. + Returns: + FeastObject: The original `resource`, if permitted. + + Raises: + PermissionError: If the current user is not authorized to execute the requested actions on the given resources. + """ + sm = get_security_manager() + if sm is None: + return resource + return sm.assert_permissions( + resources=[resource], actions=actions, filter_only=False + )[0] + + +def permitted_resources( + resources: list[FeastObject], + actions: Union[AuthzedAction, List[AuthzedAction]], +) -> list[FeastObject]: + """ + A utility function to invoke the `assert_permissions` method on the global security manager. + + If no global `SecurityManager` is defined, the execution is permitted. + + Args: + resources: The resources for which we need to enforce authorized permission. + actions: The requested actions to be authorized. + Returns: + list[FeastObject]]: A filtered list of the permitted resources, possibly empty. + """ + sm = get_security_manager() + if sm is None: + return resources + return sm.assert_permissions(resources=resources, actions=actions, filter_only=True) + + +""" +The possibly empty global instance of `SecurityManager`. +""" +_sm: Optional[SecurityManager] = None + + +def get_security_manager() -> Optional[SecurityManager]: + """ + Return the global instance of `SecurityManager`. + """ + global _sm + return _sm + + +def set_security_manager(sm: SecurityManager): + """ + Initialize the global instance of `SecurityManager`. + """ + + global _sm + _sm = sm + + +def no_security_manager(): + """ + Initialize the empty global instance of `SecurityManager`. + """ + + global _sm + _sm = None diff --git a/sdk/python/feast/permissions/server/__init__.py b/sdk/python/feast/permissions/server/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/sdk/python/feast/permissions/server/arrow.py b/sdk/python/feast/permissions/server/arrow.py new file mode 100644 index 0000000000..5eba7d0916 --- /dev/null +++ b/sdk/python/feast/permissions/server/arrow.py @@ -0,0 +1,111 @@ +""" +A module with utility functions and classes to support authorizing the Arrow Flight servers. +""" + +import asyncio +import functools +import logging +from typing import Optional, cast + +import pyarrow.flight as fl +from pyarrow.flight import ServerCallContext + +from feast.permissions.auth.auth_manager import ( + get_auth_manager, +) +from feast.permissions.security_manager import get_security_manager +from feast.permissions.server.utils import ( + AuthManagerType, +) +from feast.permissions.user import User + +logger = logging.getLogger(__name__) +logger.setLevel(logging.INFO) + + +def arrowflight_middleware( + auth_type: AuthManagerType, +) -> Optional[dict[str, fl.ServerMiddlewareFactory]]: + """ + A dictionary with the configured middlewares to support extracting the user details when the authorization manager is defined. 
+ The authorization middleware key is `auth`. + + Returns: + dict[str, fl.ServerMiddlewareFactory]: Optional dictionary of middlewares. If the authorization type is set to `NONE`, it returns `None`. + """ + + if auth_type == AuthManagerType.NONE: + return None + + return { + "auth": AuthorizationMiddlewareFactory(), + } + + +class AuthorizationMiddlewareFactory(fl.ServerMiddlewareFactory): + """ + A middleware factory to intercept the authorization header and propagate it to the authorization middleware. + """ + + def __init__(self): + pass + + def start_call(self, info, headers): + """ + Intercept the authorization header and propagate it to the authorization middleware. + """ + access_token = get_auth_manager().token_extractor.extract_access_token( + headers=headers + ) + return AuthorizationMiddleware(access_token=access_token) + + +class AuthorizationMiddleware(fl.ServerMiddleware): + """ + A server middleware holding the authorization header and offering a method to extract the user credentials. + """ + + def __init__(self, access_token: str): + self.access_token = access_token + + def call_completed(self, exception): + if exception: + print(f"{AuthorizationMiddleware.__name__} received {exception}") + + async def extract_user(self) -> User: + """ + Use the configured `TokenParser` to extract the user credentials. + """ + return await get_auth_manager().token_parser.user_details_from_access_token( + self.access_token + ) + + +def inject_user_details(context: ServerCallContext): + """ + Function to use in Arrow Flight endpoints (e.g. `do_get`, `do_put` and so on) to access the token extracted from the header, + extract the user details out of it and propagate them to the current security manager, if any. + + Args: + context: The endpoint context. + """ + if context.get_middleware("auth") is None: + logger.info("No `auth` middleware.") + return + + sm = get_security_manager() + if sm is not None: + auth_middleware = cast(AuthorizationMiddleware, context.get_middleware("auth")) + current_user = asyncio.run(auth_middleware.extract_user()) + print(f"extracted user: {current_user}") + + sm.set_current_user(current_user) + + +def inject_user_details_decorator(func): + @functools.wraps(func) + def wrapper(self, context, *args, **kwargs): + inject_user_details(context) + return func(self, context, *args, **kwargs) + + return wrapper diff --git a/sdk/python/feast/permissions/server/arrow_flight_token_extractor.py b/sdk/python/feast/permissions/server/arrow_flight_token_extractor.py new file mode 100644 index 0000000000..2378fa8b19 --- /dev/null +++ b/sdk/python/feast/permissions/server/arrow_flight_token_extractor.py @@ -0,0 +1,42 @@ +from starlette.authentication import ( + AuthenticationError, +) + +from feast.permissions.auth.token_extractor import TokenExtractor + + +class ArrowFlightTokenExtractor(TokenExtractor): + def extract_access_token(self, **kwargs) -> str: + """ + Token extractor for Arrow Flight requests. + + Requires a keyword argument called `headers` of type `dict`. + + Returns: + The extracted access token. 
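
As a hedged sketch of how these helpers attach to a Flight service (the class and its `do_get` body below are placeholders, not Feast's actual offline server):

```python
import pyarrow.flight as fl

from feast.permissions.server.arrow import (
    arrowflight_middleware,
    inject_user_details_decorator,
)
from feast.permissions.server.utils import AuthManagerType


class DemoFlightServer(fl.FlightServerBase):
    def __init__(self, location: str, auth_type: AuthManagerType):
        # The "auth" middleware extracts the Bearer token from each incoming call.
        super().__init__(
            location=location, middleware=arrowflight_middleware(auth_type)
        )

    @inject_user_details_decorator
    def do_get(self, context, ticket):
        # By now the security manager holds the calling user, so permission
        # checks (e.g. assert_permissions) can run against it.
        ...
```
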
+ """ + + if "headers" not in kwargs: + raise ValueError("Missing keywork argument 'headers'") + if not isinstance(kwargs["headers"], dict): + raise ValueError( + f"The keywork argument 'headers' is not of the expected type {dict.__name__}" + ) + + access_token = None + headers = kwargs["headers"] + if isinstance(headers, dict): + for header in headers: + if header.lower() == "authorization": + # With Arrow Flight, the header value is a list and we take the 0-th element + if not isinstance(headers[header], list): + raise AuthenticationError( + f"Authorization header must be of type list, found {type(headers[header])}" + ) + + return self._extract_bearer_token(headers[header][0]) + + if access_token is None: + raise AuthenticationError("Missing authorization header") + + return access_token diff --git a/sdk/python/feast/permissions/server/grpc.py b/sdk/python/feast/permissions/server/grpc.py new file mode 100644 index 0000000000..3c94240869 --- /dev/null +++ b/sdk/python/feast/permissions/server/grpc.py @@ -0,0 +1,54 @@ +import asyncio +import logging +from typing import Optional + +import grpc + +from feast.permissions.auth.auth_manager import ( + get_auth_manager, +) +from feast.permissions.security_manager import get_security_manager +from feast.permissions.server.utils import ( + AuthManagerType, +) + +logger = logging.getLogger(__name__) +logger.setLevel(logging.INFO) + + +def grpc_interceptors( + auth_type: AuthManagerType, +) -> Optional[list[grpc.ServerInterceptor]]: + """ + A list of the authorization interceptors. + + Args: + auth_type: The type of authorization manager, from the feature store configuration. + + Returns: + list[grpc.ServerInterceptor]: Optional list of interceptors. If the authorization type is set to `NONE`, it returns `None`. + """ + if auth_type == AuthManagerType.NONE: + return None + + return [AuthInterceptor()] + + +class AuthInterceptor(grpc.ServerInterceptor): + def intercept_service(self, continuation, handler_call_details): + sm = get_security_manager() + + if sm is not None: + auth_manager = get_auth_manager() + access_token = auth_manager.token_extractor.extract_access_token( + metadata=dict(handler_call_details.invocation_metadata) + ) + + print(f"Fetching user for token: {len(access_token)}") + current_user = asyncio.run( + auth_manager.token_parser.user_details_from_access_token(access_token) + ) + print(f"User is: {current_user}") + sm.set_current_user(current_user) + + return continuation(handler_call_details) diff --git a/sdk/python/feast/permissions/server/grpc_token_extractor.py b/sdk/python/feast/permissions/server/grpc_token_extractor.py new file mode 100644 index 0000000000..d75a18ded5 --- /dev/null +++ b/sdk/python/feast/permissions/server/grpc_token_extractor.py @@ -0,0 +1,36 @@ +from starlette.authentication import ( + AuthenticationError, +) + +from feast.permissions.auth.token_extractor import TokenExtractor + + +class GrpcTokenExtractor(TokenExtractor): + def extract_access_token(self, **kwargs) -> str: + """ + Token extractor for grpc server requests. + + Requires a keyword argument called `metadata` of type `dict`. + + Returns: + The extracted access token. 
+ """ + + if "metadata" not in kwargs: + raise ValueError("Missing keywork argument 'metadata'") + if not isinstance(kwargs["metadata"], dict): + raise ValueError( + f"The keywork argument 'metadata' is not of the expected type {dict.__name__} but {type(kwargs['metadata'])}" + ) + + access_token = None + metadata = kwargs["metadata"] + if isinstance(metadata, dict): + for header in metadata: + if header.lower() == "authorization": + return self._extract_bearer_token(metadata[header]) + + if access_token is None: + raise AuthenticationError("Missing authorization header") + + return access_token diff --git a/sdk/python/feast/permissions/server/rest.py b/sdk/python/feast/permissions/server/rest.py new file mode 100644 index 0000000000..ecced3b34a --- /dev/null +++ b/sdk/python/feast/permissions/server/rest.py @@ -0,0 +1,33 @@ +""" +A module with utility functions to support authorizing the REST servers using the FastAPI framework. +""" + +from typing import Any + +from fastapi.requests import Request + +from feast.permissions.auth.auth_manager import ( + get_auth_manager, +) +from feast.permissions.security_manager import get_security_manager + + +async def inject_user_details(request: Request) -> Any: + """ + A function to extract the authorization token from a user request, extract the user details and propagate them to the + current security manager, if any. + """ + sm = get_security_manager() + current_user = None + if sm is not None: + auth_manager = get_auth_manager() + access_token = auth_manager.token_extractor.extract_access_token( + request=request + ) + current_user = await auth_manager.token_parser.user_details_from_access_token( + access_token=access_token + ) + + sm.set_current_user(current_user) + + return current_user diff --git a/sdk/python/feast/permissions/server/rest_token_extractor.py b/sdk/python/feast/permissions/server/rest_token_extractor.py new file mode 100644 index 0000000000..894c18eedb --- /dev/null +++ b/sdk/python/feast/permissions/server/rest_token_extractor.py @@ -0,0 +1,38 @@ +from fastapi.requests import Request +from starlette.authentication import ( + AuthenticationError, +) + +from feast.permissions.auth.token_extractor import TokenExtractor + + +class RestTokenExtractor(TokenExtractor): + def extract_access_token(self, **kwargs) -> str: + """ + Token extractor for REST requests. + + Requires a keyword argument called `request` of type `Request` + + Returns: + The extracted access token. + """ + + if "request" not in kwargs: + raise ValueError("Missing keywork argument 'request'") + if not isinstance(kwargs["request"], Request): + raise ValueError( + f"The keywork argument 'request' is not of the expected type {Request.__name__}" + ) + + access_token = None + request = kwargs["request"] + if isinstance(request, Request): + headers = request.headers + for header in headers: + if header.lower() == "authorization": + return self._extract_bearer_token(headers[header]) + + if access_token is None: + raise AuthenticationError("Missing authorization header") + + return access_token diff --git a/sdk/python/feast/permissions/server/utils.py b/sdk/python/feast/permissions/server/utils.py new file mode 100644 index 0000000000..34a2c0024a --- /dev/null +++ b/sdk/python/feast/permissions/server/utils.py @@ -0,0 +1,126 @@ +""" +A module with utility functions to support the authorization management in Feast servers. 
+""" + +import enum +import logging + +import feast +from feast.permissions.auth.auth_manager import ( + AllowAll, + AuthManager, + set_auth_manager, +) +from feast.permissions.auth.kubernetes_token_parser import KubernetesTokenParser +from feast.permissions.auth.oidc_token_parser import OidcTokenParser +from feast.permissions.auth.token_extractor import TokenExtractor +from feast.permissions.auth.token_parser import TokenParser +from feast.permissions.auth_model import AuthConfig, OidcAuthConfig +from feast.permissions.security_manager import ( + SecurityManager, + no_security_manager, + set_security_manager, +) +from feast.permissions.server.arrow_flight_token_extractor import ( + ArrowFlightTokenExtractor, +) +from feast.permissions.server.grpc_token_extractor import GrpcTokenExtractor +from feast.permissions.server.rest_token_extractor import RestTokenExtractor + +logger = logging.getLogger(__name__) +logger.setLevel(logging.INFO) + + +class ServerType(enum.Enum): + """ + Identify the server type. + """ + + REST = "rest" + ARROW = "arrow" + GRPC = "grpc" # TODO RBAC: to be completed + + +class AuthManagerType(enum.Enum): + """ + Identify the type of authorization manager. + """ + + NONE = "no_auth" + OIDC = "oidc" + KUBERNETES = "kubernetes" + + +def str_to_auth_manager_type(value: str) -> AuthManagerType: + for t in AuthManagerType.__members__.values(): + if t.value.lower() == value.lower(): + return t + + logger.warning( + f"Requested unmanaged AuthManagerType of value {value}. Using NONE instead." + ) + return AuthManagerType.NONE + + +def init_security_manager(auth_type: AuthManagerType, fs: "feast.FeatureStore"): + """ + Initialize the global security manager. + Must be invoked at Feast server initialization time to create the `SecurityManager` instance. + + Args: + auth_type: The authorization manager type. + registry: The feature store registry. + """ + if auth_type == AuthManagerType.NONE: + no_security_manager() + else: + # TODO permissions from registry + set_security_manager( + SecurityManager( + project=fs.project, + registry=fs.registry, + ) + ) + + +def init_auth_manager( + server_type: ServerType, auth_type: AuthManagerType, auth_config: AuthConfig +): + """ + Initialize the global authorization manager. + Must be invoked at Feast server initialization time to create the `AuthManager` instance. + + Args: + server_type: The server type. + auth_type: The authorization manager type. + + Raises: + ValueError: If any input argument has an unmanaged value. 
+ """ + if auth_type == AuthManagerType.NONE: + set_auth_manager(AllowAll()) + else: + token_extractor: TokenExtractor + token_parser: TokenParser + + if server_type == ServerType.REST: + token_extractor = RestTokenExtractor() + elif server_type == ServerType.ARROW: + token_extractor = ArrowFlightTokenExtractor() + elif server_type == ServerType.GRPC: + token_extractor = GrpcTokenExtractor() + else: + raise ValueError(f"Unmanaged server type {server_type}") + + if auth_type == AuthManagerType.KUBERNETES: + token_parser = KubernetesTokenParser() + elif auth_type == AuthManagerType.OIDC: + assert isinstance(auth_config, OidcAuthConfig) + token_parser = OidcTokenParser(auth_config=auth_config) + else: + raise ValueError(f"Unmanaged authorization manager type {auth_type}") + + auth_manager = AuthManager( + token_extractor=token_extractor, token_parser=token_parser + ) + set_auth_manager(auth_manager) diff --git a/sdk/python/feast/permissions/user.py b/sdk/python/feast/permissions/user.py new file mode 100644 index 0000000000..783b683de6 --- /dev/null +++ b/sdk/python/feast/permissions/user.py @@ -0,0 +1,38 @@ +import logging + +logger = logging.getLogger(__name__) + + +class User: + _username: str + _roles: list[str] + + def __init__(self, username: str, roles: list[str]): + self._username = username + self._roles = roles + + @property + def username(self): + return self._username + + @property + def roles(self): + return self._roles + + def has_matching_role(self, requested_roles: list[str]) -> bool: + """ + Verify that the user has at least one of the requested roles. + + Args: + requested_roles: The list of requested roles. + + Returns: + bool: `True` only if the user has any registered role and all the given roles are registered. + """ + logger.debug( + f"Check {self.username} has all {requested_roles}: currently {self.roles}" + ) + return any(role in self.roles for role in requested_roles) + + def __str__(self): + return f"{self.username} ({self.roles})" diff --git a/sdk/python/feast/registry_server.py b/sdk/python/feast/registry_server.py index 53acb9f625..497036dfa3 100644 --- a/sdk/python/feast/registry_server.py +++ b/sdk/python/feast/registry_server.py @@ -1,17 +1,29 @@ from concurrent import futures from datetime import datetime, timezone +from typing import Union, cast import grpc from google.protobuf.empty_pb2 import Empty -from feast import FeatureStore +from feast import FeatureService, FeatureStore from feast.data_source import DataSource from feast.entity import Entity -from feast.feature_service import FeatureService +from feast.errors import FeatureViewNotFoundException +from feast.feast_object import FeastObject from feast.feature_view import FeatureView from feast.infra.infra_object import Infra from feast.infra.registry.base_registry import BaseRegistry from feast.on_demand_feature_view import OnDemandFeatureView +from feast.permissions.action import CRUD, AuthzedAction +from feast.permissions.permission import Permission +from feast.permissions.security_manager import assert_permissions, permitted_resources +from feast.permissions.server.grpc import grpc_interceptors +from feast.permissions.server.utils import ( + ServerType, + init_auth_manager, + init_security_manager, + str_to_auth_manager_type, +) from feast.protos.feast.registry import RegistryServer_pb2, RegistryServer_pb2_grpc from feast.saved_dataset import SavedDataset, ValidationReference from feast.stream_feature_view import StreamFeatureView @@ -24,30 +36,55 @@ def __init__(self, registry: BaseRegistry) -> 
None: def ApplyEntity(self, request: RegistryServer_pb2.ApplyEntityRequest, context): self.proxied_registry.apply_entity( - entity=Entity.from_proto(request.entity), + entity=cast( + Entity, + assert_permissions( + resource=Entity.from_proto(request.entity), + actions=CRUD, + ), + ), project=request.project, commit=request.commit, ) + return Empty() def GetEntity(self, request: RegistryServer_pb2.GetEntityRequest, context): - return self.proxied_registry.get_entity( - name=request.name, project=request.project, allow_cache=request.allow_cache + return assert_permissions( + self.proxied_registry.get_entity( + name=request.name, + project=request.project, + allow_cache=request.allow_cache, + ), + actions=[AuthzedAction.DESCRIBE], ).to_proto() def ListEntities(self, request: RegistryServer_pb2.ListEntitiesRequest, context): return RegistryServer_pb2.ListEntitiesResponse( entities=[ entity.to_proto() - for entity in self.proxied_registry.list_entities( - project=request.project, - allow_cache=request.allow_cache, - tags=dict(request.tags), + for entity in permitted_resources( + resources=cast( + list[FeastObject], + self.proxied_registry.list_entities( + project=request.project, + allow_cache=request.allow_cache, + tags=dict(request.tags), + ), + ), + actions=AuthzedAction.DESCRIBE, ) ] ) def DeleteEntity(self, request: RegistryServer_pb2.DeleteEntityRequest, context): + assert_permissions( + resource=self.proxied_registry.get_entity( + name=request.name, project=request.project + ), + actions=AuthzedAction.DELETE, + ) + self.proxied_registry.delete_entity( name=request.name, project=request.project, commit=request.commit ) @@ -56,16 +93,30 @@ def DeleteEntity(self, request: RegistryServer_pb2.DeleteEntityRequest, context) def ApplyDataSource( self, request: RegistryServer_pb2.ApplyDataSourceRequest, context ): - self.proxied_registry.apply_data_source( - data_source=DataSource.from_proto(request.data_source), - project=request.project, - commit=request.commit, + ( + self.proxied_registry.apply_data_source( + data_source=cast( + DataSource, + assert_permissions( + resource=DataSource.from_proto(request.data_source), + actions=CRUD, + ), + ), + project=request.project, + commit=request.commit, + ), ) + return Empty() def GetDataSource(self, request: RegistryServer_pb2.GetDataSourceRequest, context): - return self.proxied_registry.get_data_source( - name=request.name, project=request.project, allow_cache=request.allow_cache + return assert_permissions( + resource=self.proxied_registry.get_data_source( + name=request.name, + project=request.project, + allow_cache=request.allow_cache, + ), + actions=AuthzedAction.DESCRIBE, ).to_proto() def ListDataSources( @@ -74,10 +125,16 @@ def ListDataSources( return RegistryServer_pb2.ListDataSourcesResponse( data_sources=[ data_source.to_proto() - for data_source in self.proxied_registry.list_data_sources( - project=request.project, - allow_cache=request.allow_cache, - tags=dict(request.tags), + for data_source in permitted_resources( + resources=cast( + list[FeastObject], + self.proxied_registry.list_data_sources( + project=request.project, + allow_cache=request.allow_cache, + tags=dict(request.tags), + ), + ), + actions=AuthzedAction.DESCRIBE, ) ] ) @@ -85,6 +142,14 @@ def ListDataSources( def DeleteDataSource( self, request: RegistryServer_pb2.DeleteDataSourceRequest, context ): + assert_permissions( + resource=self.proxied_registry.get_data_source( + name=request.name, + project=request.project, + ), + actions=AuthzedAction.DELETE, + ) + 
self.proxied_registry.delete_data_source( name=request.name, project=request.project, commit=request.commit ) @@ -93,8 +158,13 @@ def DeleteDataSource( def GetFeatureView( self, request: RegistryServer_pb2.GetFeatureViewRequest, context ): - return self.proxied_registry.get_feature_view( - name=request.name, project=request.project, allow_cache=request.allow_cache + return assert_permissions( + self.proxied_registry.get_feature_view( + name=request.name, + project=request.project, + allow_cache=request.allow_cache, + ), + actions=[AuthzedAction.DESCRIBE], ).to_proto() def ApplyFeatureView( @@ -110,9 +180,17 @@ def ApplyFeatureView( elif feature_view_type == "stream_feature_view": feature_view = StreamFeatureView.from_proto(request.stream_feature_view) - self.proxied_registry.apply_feature_view( - feature_view=feature_view, project=request.project, commit=request.commit + ( + self.proxied_registry.apply_feature_view( + feature_view=cast( + FeatureView, + assert_permissions(resource=feature_view, actions=CRUD), + ), + project=request.project, + commit=request.commit, + ), ) + return Empty() def ListFeatureViews( @@ -121,10 +199,16 @@ def ListFeatureViews( return RegistryServer_pb2.ListFeatureViewsResponse( feature_views=[ feature_view.to_proto() - for feature_view in self.proxied_registry.list_feature_views( - project=request.project, - allow_cache=request.allow_cache, - tags=dict(request.tags), + for feature_view in permitted_resources( + resources=cast( + list[FeastObject], + self.proxied_registry.list_feature_views( + project=request.project, + allow_cache=request.allow_cache, + tags=dict(request.tags), + ), + ), + actions=AuthzedAction.DESCRIBE, ) ] ) @@ -132,6 +216,21 @@ def ListFeatureViews( def DeleteFeatureView( self, request: RegistryServer_pb2.DeleteFeatureViewRequest, context ): + feature_view: Union[StreamFeatureView, FeatureView] + + try: + feature_view = self.proxied_registry.get_stream_feature_view( + name=request.name, project=request.project, allow_cache=False + ) + except FeatureViewNotFoundException: + feature_view = self.proxied_registry.get_feature_view( + name=request.name, project=request.project, allow_cache=False + ) + + assert_permissions( + resource=feature_view, + actions=[AuthzedAction.DELETE], + ) self.proxied_registry.delete_feature_view( name=request.name, project=request.project, commit=request.commit ) @@ -140,8 +239,13 @@ def DeleteFeatureView( def GetStreamFeatureView( self, request: RegistryServer_pb2.GetStreamFeatureViewRequest, context ): - return self.proxied_registry.get_stream_feature_view( - name=request.name, project=request.project, allow_cache=request.allow_cache + return assert_permissions( + resource=self.proxied_registry.get_stream_feature_view( + name=request.name, + project=request.project, + allow_cache=request.allow_cache, + ), + actions=[AuthzedAction.DESCRIBE], ).to_proto() def ListStreamFeatureViews( @@ -150,10 +254,16 @@ def ListStreamFeatureViews( return RegistryServer_pb2.ListStreamFeatureViewsResponse( stream_feature_views=[ stream_feature_view.to_proto() - for stream_feature_view in self.proxied_registry.list_stream_feature_views( - project=request.project, - allow_cache=request.allow_cache, - tags=dict(request.tags), + for stream_feature_view in permitted_resources( + resources=cast( + list[FeastObject], + self.proxied_registry.list_stream_feature_views( + project=request.project, + allow_cache=request.allow_cache, + tags=dict(request.tags), + ), + ), + actions=AuthzedAction.DESCRIBE, ) ] ) @@ -161,8 +271,13 @@ def 
ListStreamFeatureViews( def GetOnDemandFeatureView( self, request: RegistryServer_pb2.GetOnDemandFeatureViewRequest, context ): - return self.proxied_registry.get_on_demand_feature_view( - name=request.name, project=request.project, allow_cache=request.allow_cache + return assert_permissions( + resource=self.proxied_registry.get_on_demand_feature_view( + name=request.name, + project=request.project, + allow_cache=request.allow_cache, + ), + actions=[AuthzedAction.DESCRIBE], ).to_proto() def ListOnDemandFeatureViews( @@ -171,10 +286,16 @@ def ListOnDemandFeatureViews( return RegistryServer_pb2.ListOnDemandFeatureViewsResponse( on_demand_feature_views=[ on_demand_feature_view.to_proto() - for on_demand_feature_view in self.proxied_registry.list_on_demand_feature_views( - project=request.project, - allow_cache=request.allow_cache, - tags=dict(request.tags), + for on_demand_feature_view in permitted_resources( + resources=cast( + list[FeastObject], + self.proxied_registry.list_on_demand_feature_views( + project=request.project, + allow_cache=request.allow_cache, + tags=dict(request.tags), + ), + ), + actions=AuthzedAction.DESCRIBE, ) ] ) @@ -183,17 +304,29 @@ def ApplyFeatureService( self, request: RegistryServer_pb2.ApplyFeatureServiceRequest, context ): self.proxied_registry.apply_feature_service( - feature_service=FeatureService.from_proto(request.feature_service), + feature_service=cast( + FeatureService, + assert_permissions( + resource=FeatureService.from_proto(request.feature_service), + actions=CRUD, + ), + ), project=request.project, commit=request.commit, ) + return Empty() def GetFeatureService( self, request: RegistryServer_pb2.GetFeatureServiceRequest, context ): - return self.proxied_registry.get_feature_service( - name=request.name, project=request.project, allow_cache=request.allow_cache + return assert_permissions( + resource=self.proxied_registry.get_feature_service( + name=request.name, + project=request.project, + allow_cache=request.allow_cache, + ), + actions=[AuthzedAction.DESCRIBE], ).to_proto() def ListFeatureServices( @@ -202,10 +335,16 @@ def ListFeatureServices( return RegistryServer_pb2.ListFeatureServicesResponse( feature_services=[ feature_service.to_proto() - for feature_service in self.proxied_registry.list_feature_services( - project=request.project, - allow_cache=request.allow_cache, - tags=dict(request.tags), + for feature_service in permitted_resources( + resources=cast( + list[FeastObject], + self.proxied_registry.list_feature_services( + project=request.project, + allow_cache=request.allow_cache, + tags=dict(request.tags), + ), + ), + actions=AuthzedAction.DESCRIBE, ) ] ) @@ -213,6 +352,15 @@ def ListFeatureServices( def DeleteFeatureService( self, request: RegistryServer_pb2.DeleteFeatureServiceRequest, context ): + ( + assert_permissions( + resource=self.proxied_registry.get_feature_service( + name=request.name, project=request.project + ), + actions=[AuthzedAction.DELETE], + ), + ) + self.proxied_registry.delete_feature_service( name=request.name, project=request.project, commit=request.commit ) @@ -221,18 +369,32 @@ def DeleteFeatureService( def ApplySavedDataset( self, request: RegistryServer_pb2.ApplySavedDatasetRequest, context ): - self.proxied_registry.apply_saved_dataset( - saved_dataset=SavedDataset.from_proto(request.saved_dataset), - project=request.project, - commit=request.commit, + ( + self.proxied_registry.apply_saved_dataset( + saved_dataset=cast( + SavedDataset, + assert_permissions( + 
resource=SavedDataset.from_proto(request.saved_dataset), + actions=CRUD, + ), + ), + project=request.project, + commit=request.commit, + ), ) + return Empty() def GetSavedDataset( self, request: RegistryServer_pb2.GetSavedDatasetRequest, context ): - return self.proxied_registry.get_saved_dataset( - name=request.name, project=request.project, allow_cache=request.allow_cache + return assert_permissions( + self.proxied_registry.get_saved_dataset( + name=request.name, + project=request.project, + allow_cache=request.allow_cache, + ), + actions=[AuthzedAction.DESCRIBE], ).to_proto() def ListSavedDatasets( @@ -241,8 +403,16 @@ def ListSavedDatasets( return RegistryServer_pb2.ListSavedDatasetsResponse( saved_datasets=[ saved_dataset.to_proto() - for saved_dataset in self.proxied_registry.list_saved_datasets( - project=request.project, allow_cache=request.allow_cache + for saved_dataset in permitted_resources( + resources=cast( + list[FeastObject], + self.proxied_registry.list_saved_datasets( + project=request.project, + allow_cache=request.allow_cache, + tags=dict(request.tags), + ), + ), + actions=AuthzedAction.DESCRIBE, ) ] ) @@ -250,6 +420,13 @@ def ListSavedDatasets( def DeleteSavedDataset( self, request: RegistryServer_pb2.DeleteSavedDatasetRequest, context ): + assert_permissions( + resource=self.proxied_registry.get_saved_dataset( + name=request.name, project=request.project + ), + actions=[AuthzedAction.DELETE], + ) + self.proxied_registry.delete_saved_dataset( name=request.name, project=request.project, commit=request.commit ) @@ -259,19 +436,29 @@ def ApplyValidationReference( self, request: RegistryServer_pb2.ApplyValidationReferenceRequest, context ): self.proxied_registry.apply_validation_reference( - validation_reference=ValidationReference.from_proto( - request.validation_reference + validation_reference=cast( + ValidationReference, + assert_permissions( + ValidationReference.from_proto(request.validation_reference), + actions=CRUD, + ), ), project=request.project, commit=request.commit, ) + return Empty() def GetValidationReference( self, request: RegistryServer_pb2.GetValidationReferenceRequest, context ): - return self.proxied_registry.get_validation_reference( - name=request.name, project=request.project, allow_cache=request.allow_cache + return assert_permissions( + self.proxied_registry.get_validation_reference( + name=request.name, + project=request.project, + allow_cache=request.allow_cache, + ), + actions=[AuthzedAction.DESCRIBE], ).to_proto() def ListValidationReferences( @@ -280,8 +467,16 @@ def ListValidationReferences( return RegistryServer_pb2.ListValidationReferencesResponse( validation_references=[ validation_reference.to_proto() - for validation_reference in self.proxied_registry.list_validation_references( - project=request.project, allow_cache=request.allow_cache + for validation_reference in permitted_resources( + resources=cast( + list[FeastObject], + self.proxied_registry.list_validation_references( + project=request.project, + allow_cache=request.allow_cache, + tags=dict(request.tags), + ), + ), + actions=AuthzedAction.DESCRIBE, ) ] ) @@ -289,6 +484,12 @@ def ListValidationReferences( def DeleteValidationReference( self, request: RegistryServer_pb2.DeleteValidationReferenceRequest, context ): + assert_permissions( + resource=self.proxied_registry.get_validation_reference( + name=request.name, project=request.project + ), + actions=[AuthzedAction.DELETE], + ) self.proxied_registry.delete_validation_reference( name=request.name, project=request.project, 
commit=request.commit ) @@ -309,6 +510,11 @@ def ListProjectMetadata( def ApplyMaterialization( self, request: RegistryServer_pb2.ApplyMaterializationRequest, context ): + assert_permissions( + resource=FeatureView.from_proto(request.feature_view), + actions=[AuthzedAction.WRITE_ONLINE], + ) + self.proxied_registry.apply_materialization( feature_view=FeatureView.from_proto(request.feature_view), project=request.project, @@ -336,6 +542,67 @@ def GetInfra(self, request: RegistryServer_pb2.GetInfraRequest, context): project=request.project, allow_cache=request.allow_cache ).to_proto() + def ApplyPermission( + self, request: RegistryServer_pb2.ApplyPermissionRequest, context + ): + self.proxied_registry.apply_permission( + permission=cast( + Permission, + assert_permissions( + Permission.from_proto(request.permission), actions=CRUD + ), + ), + project=request.project, + commit=request.commit, + ) + return Empty() + + def GetPermission(self, request: RegistryServer_pb2.GetPermissionRequest, context): + permission = self.proxied_registry.get_permission( + name=request.name, project=request.project, allow_cache=request.allow_cache + ) + assert_permissions( + resource=permission, + actions=[AuthzedAction.DESCRIBE], + ) + permission.to_proto().spec.project = request.project + + return permission.to_proto() + + def ListPermissions( + self, request: RegistryServer_pb2.ListPermissionsRequest, context + ): + return RegistryServer_pb2.ListPermissionsResponse( + permissions=[ + permission.to_proto() + for permission in permitted_resources( + resources=cast( + list[FeastObject], + self.proxied_registry.list_permissions( + project=request.project, allow_cache=request.allow_cache + ), + ), + actions=AuthzedAction.DESCRIBE, + ) + ] + ) + + def DeletePermission( + self, request: RegistryServer_pb2.DeletePermissionRequest, context + ): + assert_permissions( + resource=self.proxied_registry.get_permission( + name=request.name, + project=request.project, + ), + actions=[AuthzedAction.DELETE], + ) + + self.proxied_registry.delete_permission( + name=request.name, project=request.project, commit=request.commit + ) + return Empty() + def Commit(self, request, context): self.proxied_registry.commit() return Empty() @@ -348,11 +615,25 @@ def Proto(self, request, context): return self.proxied_registry.proto() -def start_server(store: FeatureStore, port: int): - server = grpc.server(futures.ThreadPoolExecutor(max_workers=10)) +def start_server(store: FeatureStore, port: int, wait_for_termination: bool = True): + auth_manager_type = str_to_auth_manager_type(store.config.auth_config.type) + init_security_manager(auth_type=auth_manager_type, fs=store) + init_auth_manager( + auth_type=auth_manager_type, + server_type=ServerType.GRPC, + auth_config=store.config.auth_config, + ) + + server = grpc.server( + futures.ThreadPoolExecutor(max_workers=10), + interceptors=grpc_interceptors(auth_manager_type), + ) RegistryServer_pb2_grpc.add_RegistryServerServicer_to_server( RegistryServer(store.registry), server ) server.add_insecure_port(f"[::]:{port}") server.start() - server.wait_for_termination() + if wait_for_termination: + server.wait_for_termination() + else: + return server diff --git a/sdk/python/feast/repo_config.py b/sdk/python/feast/repo_config.py index fc2792e323..069b579999 100644 --- a/sdk/python/feast/repo_config.py +++ b/sdk/python/feast/repo_config.py @@ -19,12 +19,14 @@ from feast.errors import ( FeastFeatureServerTypeInvalidError, + FeastInvalidAuthConfigClass, FeastOfflineStoreInvalidName, 
FeastOnlineStoreInvalidName, FeastRegistryNotSetError, FeastRegistryTypeInvalidError, ) from feast.importer import import_class +from feast.permissions.auth.auth_type import AuthType warnings.simplefilter("once", RuntimeWarning) @@ -86,6 +88,12 @@ "local": "feast.infra.feature_servers.local_process.config.LocalFeatureServerConfig", } +AUTH_CONFIGS_CLASS_FOR_TYPE = { + "no_auth": "feast.permissions.auth_model.NoAuthConfig", + "kubernetes": "feast.permissions.auth_model.KubernetesAuthConfig", + "oidc": "feast.permissions.auth_model.OidcAuthConfig", +} + class FeastBaseModel(BaseModel): """Feast Pydantic Configuration Class""" @@ -167,6 +175,9 @@ class RepoConfig(FeastBaseModel): online_config: Any = Field(None, alias="online_store") """ OnlineStoreConfig: Online store configuration (optional depending on provider) """ + auth: Any = Field(None, alias="auth") + """ auth: Optional if the services needs the authentication against IDPs (optional depending on provider) """ + offline_config: Any = Field(None, alias="offline_store") """ OfflineStoreConfig: Offline store configuration (optional depending on provider) """ @@ -211,6 +222,13 @@ def __init__(self, **data: Any): self._online_store = None self.online_config = data.get("online_store", "sqlite") + self._auth = None + if "auth" not in data: + self.auth = dict() + self.auth["type"] = AuthType.NONE.value + else: + self.auth = data.get("auth") + self._batch_engine = None if "batch_engine" in data: self.batch_engine_config = data["batch_engine"] @@ -270,6 +288,20 @@ def offline_store(self): self._offline_store = self.offline_config return self._offline_store + @property + def auth_config(self): + if not self._auth: + if isinstance(self.auth, Dict): + self._auth = get_auth_config_from_type(self.auth.get("type"))( + **self.auth + ) + elif isinstance(self.auth, str): + self._auth = get_auth_config_from_type(self.auth.get("type"))() + elif self.auth: + self._auth = self.auth + + return self._auth + @property def online_store(self): if not self._online_store: @@ -300,6 +332,30 @@ def batch_engine(self): return self._batch_engine + @model_validator(mode="before") + def _validate_auth_config(cls, values: Any) -> Any: + from feast.permissions.auth_model import AuthConfig + + if "auth" in values: + allowed_auth_types = AUTH_CONFIGS_CLASS_FOR_TYPE.keys() + if isinstance(values["auth"], Dict): + if values["auth"].get("type") is None: + raise ValueError( + f"auth configuration is missing authentication type. Possible values={allowed_auth_types}" + ) + elif values["auth"]["type"] not in allowed_auth_types: + raise ValueError( + f'auth configuration has invalid authentication type={values["auth"]["type"]}. Possible ' + f'values={allowed_auth_types}' + ) + elif isinstance(values["auth"], AuthConfig): + if values["auth"].type not in allowed_auth_types: + raise ValueError( + f'auth configuration has invalid authentication type={values["auth"].type}. Possible ' + f'values={allowed_auth_types}' + ) + return values + @model_validator(mode="before") def _validate_online_store_config(cls, values: Any) -> Any: # This method will validate whether the online store configurations are set correctly. 
This explicit validation @@ -480,6 +536,17 @@ def get_online_config_from_type(online_store_type: str): return import_class(module_name, config_class_name, config_class_name) +def get_auth_config_from_type(auth_config_type: str): + if auth_config_type in AUTH_CONFIGS_CLASS_FOR_TYPE: + auth_config_type = AUTH_CONFIGS_CLASS_FOR_TYPE[auth_config_type] + elif not auth_config_type.endswith("AuthConfig"): + raise FeastInvalidAuthConfigClass(auth_config_type) + module_name, online_store_class_type = auth_config_type.rsplit(".", 1) + config_class_name = f"{online_store_class_type}" + + return import_class(module_name, config_class_name, config_class_name) + + def get_offline_config_from_type(offline_store_type: str): if offline_store_type in OFFLINE_STORE_CLASS_FOR_TYPE: offline_store_type = OFFLINE_STORE_CLASS_FOR_TYPE[offline_store_type] diff --git a/sdk/python/feast/repo_contents.py b/sdk/python/feast/repo_contents.py index 33b99f29b2..9893d5be4e 100644 --- a/sdk/python/feast/repo_contents.py +++ b/sdk/python/feast/repo_contents.py @@ -18,6 +18,7 @@ from feast.feature_service import FeatureService from feast.feature_view import FeatureView from feast.on_demand_feature_view import OnDemandFeatureView +from feast.permissions.permission import Permission from feast.protos.feast.core.Registry_pb2 import Registry as RegistryProto from feast.stream_feature_view import StreamFeatureView @@ -33,6 +34,7 @@ class RepoContents(NamedTuple): stream_feature_views: List[StreamFeatureView] entities: List[Entity] feature_services: List[FeatureService] + permissions: List[Permission] def to_registry_proto(self) -> RegistryProto: registry_proto = RegistryProto() @@ -50,4 +52,6 @@ def to_registry_proto(self) -> RegistryProto: registry_proto.stream_feature_views.extend( [fv.to_proto() for fv in self.stream_feature_views] ) + registry_proto.permissions.extend([p.to_proto() for p in self.permissions]) + return registry_proto diff --git a/sdk/python/feast/repo_operations.py b/sdk/python/feast/repo_operations.py index 0a89ab72ca..cb27568957 100644 --- a/sdk/python/feast/repo_operations.py +++ b/sdk/python/feast/repo_operations.py @@ -27,6 +27,7 @@ from feast.infra.registry.registry import FEAST_OBJECT_TYPES, FeastObjectType, Registry from feast.names import adjectives, animals from feast.on_demand_feature_view import OnDemandFeatureView +from feast.permissions.permission import Permission from feast.repo_config import RepoConfig from feast.repo_contents import RepoContents from feast.stream_feature_view import StreamFeatureView @@ -120,6 +121,7 @@ def parse_repo(repo_root: Path) -> RepoContents: feature_services=[], on_demand_feature_views=[], stream_feature_views=[], + permissions=[], ) for repo_file in get_repo_files(repo_root): @@ -201,6 +203,10 @@ def parse_repo(repo_root: Path) -> RepoContents: (obj is odfv) for odfv in res.on_demand_feature_views ): res.on_demand_feature_views.append(obj) + elif isinstance(obj, Permission) and not any( + (obj is p) for p in res.permissions + ): + res.permissions.append(obj) res.entities.append(DUMMY_ENTITY) return res @@ -367,7 +373,12 @@ def registry_dump(repo_config: RepoConfig, repo_path: Path) -> str: """For debugging only: output contents of the metadata registry""" registry_config = repo_config.registry project = repo_config.project - registry = Registry(project, registry_config=registry_config, repo_path=repo_path) + registry = Registry( + project, + registry_config=registry_config, + repo_path=repo_path, + auth_config=repo_config.auth_config, + ) registry_dict = 
registry.to_dict(project=project) return json.dumps(registry_dict, indent=2, sort_keys=True) diff --git a/sdk/python/feast/templates/local/feature_repo/feature_store.yaml b/sdk/python/feast/templates/local/feature_repo/feature_store.yaml index 3e6a360316..11b339583e 100644 --- a/sdk/python/feast/templates/local/feature_repo/feature_store.yaml +++ b/sdk/python/feast/templates/local/feature_repo/feature_store.yaml @@ -7,3 +7,6 @@ online_store: type: sqlite path: data/online_store.db entity_key_serialization_version: 2 +# By default, no_auth for authentication and authorization, other possible values kubernetes and oidc. Refer the documentation for more details. +auth: + type: no_auth diff --git a/sdk/python/feast/templates/minimal/feature_repo/feature_store.yaml b/sdk/python/feast/templates/minimal/feature_repo/feature_store.yaml index 9808690005..45a0ce7718 100644 --- a/sdk/python/feast/templates/minimal/feature_repo/feature_store.yaml +++ b/sdk/python/feast/templates/minimal/feature_repo/feature_store.yaml @@ -3,4 +3,4 @@ registry: /path/to/registry.db provider: local online_store: path: /path/to/online_store.db -entity_key_serialization_version: 2 +entity_key_serialization_version: 2 \ No newline at end of file diff --git a/sdk/python/requirements/py3.10-ci-requirements.txt b/sdk/python/requirements/py3.10-ci-requirements.txt index 785342550a..89459d1a69 100644 --- a/sdk/python/requirements/py3.10-ci-requirements.txt +++ b/sdk/python/requirements/py3.10-ci-requirements.txt @@ -40,6 +40,8 @@ async-timeout==4.0.3 # via # aiohttp # redis +async-property==0.2.2 + # via python-keycloak atpublic==4.1.0 # via ibis-framework attrs==23.2.0 @@ -63,6 +65,8 @@ beautifulsoup4==4.12.3 # via nbconvert bidict==0.23.1 # via ibis-framework +bigtree==0.19.2 + # via feast (setup.py) bleach==6.1.0 # via nbconvert boto3==1.34.131 @@ -131,6 +135,7 @@ cryptography==42.0.8 # azure-identity # azure-storage-blob # great-expectations + # jwcrypto # moto # msal # pyjwt @@ -154,6 +159,8 @@ defusedxml==0.7.1 # via nbconvert deltalake==0.18.1 # via feast (setup.py) +deprecation==2.1.0 + # via python-keycloak dill==0.3.8 # via feast (setup.py) distlib==0.3.8 @@ -300,6 +307,7 @@ httpx==0.27.0 # feast (setup.py) # fastapi # jupyterlab + # python-keycloak ibis-framework[duckdb]==9.1.0 # via # feast (setup.py) @@ -409,6 +417,8 @@ jupyterlab-server==2.27.2 # notebook jupyterlab-widgets==3.0.11 # via ipywidgets +jwcrypto==1.5.6 + # via python-keycloak kubernetes==20.13.0 # via feast (setup.py) locket==1.0.0 @@ -502,6 +512,7 @@ packaging==24.1 # build # dask # db-dtypes + # deprecation # google-cloud-bigquery # great-expectations # gunicorn @@ -712,6 +723,8 @@ python-dotenv==1.0.1 # via uvicorn python-json-logger==2.0.7 # via jupyter-events +python-keycloak==4.2.2 + # via feast (setup.py) python-multipart==0.0.9 # via fastapi pytz==2024.1 @@ -760,7 +773,9 @@ requests==2.32.3 # kubernetes # moto # msal + # python-keycloak # requests-oauthlib + # requests-toolbelt # responses # singlestoredb # snowflake-connector-python @@ -768,6 +783,8 @@ requests==2.32.3 # trino requests-oauthlib==2.0.0 # via kubernetes +requests-toolbelt==1.0.0 + # via python-keycloak responses==0.25.3 # via moto rfc3339-validator==0.1.4 @@ -972,6 +989,7 @@ typing-extensions==4.12.2 # great-expectations # ibis-framework # ipython + # jwcrypto # mypy # psycopg # psycopg-pool @@ -1050,3 +1068,4 @@ yarl==1.9.4 # via aiohttp zipp==3.19.1 # via importlib-metadata +bigtree==0.19.2 diff --git a/sdk/python/requirements/py3.10-requirements.txt 
b/sdk/python/requirements/py3.10-requirements.txt index 250e617b85..f1ec2b16ab 100644 --- a/sdk/python/requirements/py3.10-requirements.txt +++ b/sdk/python/requirements/py3.10-requirements.txt @@ -222,3 +222,4 @@ websockets==12.0 # via uvicorn zipp==3.19.1 # via importlib-metadata +bigtree==0.19.2 diff --git a/sdk/python/requirements/py3.11-ci-requirements.txt b/sdk/python/requirements/py3.11-ci-requirements.txt index f16b486aa5..fd0b5a6d26 100644 --- a/sdk/python/requirements/py3.11-ci-requirements.txt +++ b/sdk/python/requirements/py3.11-ci-requirements.txt @@ -36,6 +36,8 @@ asttokens==2.4.1 # via stack-data async-lru==2.0.4 # via jupyterlab +async-property==0.2.2 + # via python-keycloak atpublic==4.1.0 # via ibis-framework attrs==23.2.0 @@ -59,6 +61,8 @@ beautifulsoup4==4.12.3 # via nbconvert bidict==0.23.1 # via ibis-framework +bigtree==0.19.2 + # via feast (setup.py) bleach==6.1.0 # via nbconvert boto3==1.34.131 @@ -127,6 +131,7 @@ cryptography==42.0.8 # azure-identity # azure-storage-blob # great-expectations + # jwcrypto # moto # msal # pyjwt @@ -150,6 +155,8 @@ defusedxml==0.7.1 # via nbconvert deltalake==0.18.1 # via feast (setup.py) +deprecation==2.1.0 + # via python-keycloak dill==0.3.8 # via feast (setup.py) distlib==0.3.8 @@ -291,6 +298,7 @@ httpx==0.27.0 # feast (setup.py) # fastapi # jupyterlab + # python-keycloak ibis-framework[duckdb]==9.1.0 # via # feast (setup.py) @@ -400,6 +408,8 @@ jupyterlab-server==2.27.2 # notebook jupyterlab-widgets==3.0.11 # via ipywidgets +jwcrypto==1.5.6 + # via python-keycloak kubernetes==20.13.0 # via feast (setup.py) locket==1.0.0 @@ -493,6 +503,7 @@ packaging==24.1 # build # dask # db-dtypes + # deprecation # google-cloud-bigquery # great-expectations # gunicorn @@ -703,6 +714,8 @@ python-dotenv==1.0.1 # via uvicorn python-json-logger==2.0.7 # via jupyter-events +python-keycloak==4.2.2 + # via feast (setup.py) python-multipart==0.0.9 # via fastapi pytz==2024.1 @@ -751,7 +764,9 @@ requests==2.32.3 # kubernetes # moto # msal + # python-keycloak # requests-oauthlib + # requests-toolbelt # responses # singlestoredb # snowflake-connector-python @@ -759,6 +774,8 @@ requests==2.32.3 # trino requests-oauthlib==2.0.0 # via kubernetes +requests-toolbelt==1.0.0 + # via python-keycloak responses==0.25.3 # via moto rfc3339-validator==0.1.4 @@ -951,6 +968,7 @@ typing-extensions==4.12.2 # great-expectations # ibis-framework # ipython + # jwcrypto # mypy # psycopg # psycopg-pool diff --git a/sdk/python/requirements/py3.11-requirements.txt b/sdk/python/requirements/py3.11-requirements.txt index 4f1655de09..e51452a594 100644 --- a/sdk/python/requirements/py3.11-requirements.txt +++ b/sdk/python/requirements/py3.11-requirements.txt @@ -216,3 +216,4 @@ websockets==12.0 # via uvicorn zipp==3.19.1 # via importlib-metadata +bigtree==0.19.2 diff --git a/sdk/python/requirements/py3.9-ci-requirements.txt b/sdk/python/requirements/py3.9-ci-requirements.txt index 94bfa82058..be30f032a9 100644 --- a/sdk/python/requirements/py3.9-ci-requirements.txt +++ b/sdk/python/requirements/py3.9-ci-requirements.txt @@ -40,6 +40,8 @@ async-timeout==4.0.3 # via # aiohttp # redis +async-property==0.2.2 + # via python-keycloak atpublic==4.1.0 # via ibis-framework attrs==23.2.0 @@ -63,6 +65,8 @@ beautifulsoup4==4.12.3 # via nbconvert bidict==0.23.1 # via ibis-framework +bigtree==0.19.2 + # via feast (setup.py) bleach==6.1.0 # via nbconvert boto3==1.34.131 @@ -131,6 +135,7 @@ cryptography==42.0.8 # azure-identity # azure-storage-blob # great-expectations + # jwcrypto # moto # msal # 
pyjwt @@ -154,6 +159,8 @@ defusedxml==0.7.1 # via nbconvert deltalake==0.18.1 # via feast (setup.py) +deprecation==2.1.0 + # via python-keycloak dill==0.3.8 # via feast (setup.py) distlib==0.3.8 @@ -300,6 +307,7 @@ httpx==0.27.0 # feast (setup.py) # fastapi # jupyterlab + # python-keycloak ibis-framework[duckdb]==9.0.0 # via # feast (setup.py) @@ -418,6 +426,8 @@ jupyterlab-server==2.27.2 # notebook jupyterlab-widgets==3.0.11 # via ipywidgets +jwcrypto==1.5.6 + # via python-keycloak kubernetes==20.13.0 # via feast (setup.py) locket==1.0.0 @@ -511,6 +521,7 @@ packaging==24.1 # build # dask # db-dtypes + # deprecation # google-cloud-bigquery # great-expectations # gunicorn @@ -721,6 +732,8 @@ python-dotenv==1.0.1 # via uvicorn python-json-logger==2.0.7 # via jupyter-events +python-keycloak==4.2.2 + # via feast (setup.py) python-multipart==0.0.9 # via fastapi pytz==2024.1 @@ -769,7 +782,9 @@ requests==2.32.3 # kubernetes # moto # msal + # python-keycloak # requests-oauthlib + # requests-toolbelt # responses # singlestoredb # snowflake-connector-python @@ -777,6 +792,8 @@ requests==2.32.3 # trino requests-oauthlib==2.0.0 # via kubernetes +requests-toolbelt==1.0.0 + # via python-keycloak responses==0.25.3 # via moto rfc3339-validator==0.1.4 @@ -984,6 +1001,7 @@ typing-extensions==4.12.2 # great-expectations # ibis-framework # ipython + # jwcrypto # mypy # psycopg # psycopg-pool diff --git a/sdk/python/requirements/py3.9-requirements.txt b/sdk/python/requirements/py3.9-requirements.txt index f9fa856a0e..0b3c8a33c9 100644 --- a/sdk/python/requirements/py3.9-requirements.txt +++ b/sdk/python/requirements/py3.9-requirements.txt @@ -225,3 +225,4 @@ websockets==12.0 # via uvicorn zipp==3.19.2 # via importlib-metadata +bigtree==0.19.2 diff --git a/sdk/python/tests/conftest.py b/sdk/python/tests/conftest.py index 1fd510d104..74aa68e984 100644 --- a/sdk/python/tests/conftest.py +++ b/sdk/python/tests/conftest.py @@ -15,9 +15,11 @@ import multiprocessing import os import random +import tempfile from datetime import timedelta from multiprocessing import Process from sys import platform +from textwrap import dedent from typing import Any, Dict, List, Tuple, no_type_check from unittest import mock @@ -29,8 +31,8 @@ from feast.feature_store import FeatureStore # noqa: E402 from feast.utils import _utc_now from feast.wait import wait_retry_backoff # noqa: E402 -from tests.data.data_creator import ( # noqa: E402 - create_basic_driver_dataset, +from tests.data.data_creator import ( + create_basic_driver_dataset, # noqa: E402 create_document_dataset, ) from tests.integration.feature_repos.integration_test_repo_config import ( @@ -54,6 +56,7 @@ driver, location, ) +from tests.utils.auth_permissions_util import default_store from tests.utils.http_server import check_port_open, free_port # noqa: E402 logger = logging.getLogger(__name__) @@ -406,3 +409,75 @@ def fake_document_data(environment: Environment) -> Tuple[pd.DataFrame, DataSour environment.feature_store.project, ) return df, data_source + + +@pytest.fixture +def temp_dir(): + with tempfile.TemporaryDirectory() as temp_dir: + print(f"Created {temp_dir}") + yield temp_dir + + +@pytest.fixture +def server_port(): + return free_port() + + +@pytest.fixture +def feature_store(temp_dir, auth_config, applied_permissions): + print(f"Creating store at {temp_dir}") + return default_store(str(temp_dir), auth_config, applied_permissions) + + +@pytest.fixture(scope="module") +def all_markers_from_module(request): + markers = set() + for item in request.session.items: + 
for marker in item.iter_markers(): + markers.add(marker.name) + + return markers + + +@pytest.fixture(scope="module") +def is_integration_test(all_markers_from_module): + return "integration" in all_markers_from_module + + +@pytest.fixture( + scope="module", + params=[ + dedent(""" + auth: + type: no_auth + """), + dedent(""" + auth: + type: kubernetes + """), + dedent(""" + auth: + type: oidc + client_id: feast-integration-client + client_secret: feast-integration-client-secret + username: reader_writer + password: password + realm: master + auth_server_url: KEYCLOAK_URL_PLACE_HOLDER + auth_discovery_url: KEYCLOAK_URL_PLACE_HOLDER/realms/master/.well-known/openid-configuration + """), + ], +) +def auth_config(request, is_integration_test): + auth_configuration = request.param + + if is_integration_test: + if "kubernetes" in auth_configuration: + pytest.skip( + "skipping integration tests for kubernetes platform, unit tests are covering this functionality." + ) + elif "oidc" in auth_configuration: + keycloak_url = request.getfixturevalue("start_keycloak_server") + return auth_configuration.replace("KEYCLOAK_URL_PLACE_HOLDER", keycloak_url) + + return auth_configuration diff --git a/sdk/python/tests/integration/conftest.py b/sdk/python/tests/integration/conftest.py new file mode 100644 index 0000000000..5c34a448e2 --- /dev/null +++ b/sdk/python/tests/integration/conftest.py @@ -0,0 +1,16 @@ +import logging + +import pytest +from testcontainers.keycloak import KeycloakContainer + +from tests.utils.auth_permissions_util import setup_permissions_on_keycloak + +logger = logging.getLogger(__name__) + + +@pytest.fixture(scope="session") +def start_keycloak_server(): + logger.info("Starting keycloak instance") + with KeycloakContainer("quay.io/keycloak/keycloak:24.0.1") as keycloak_container: + setup_permissions_on_keycloak(keycloak_container.get_client()) + yield keycloak_container.get_url() diff --git a/sdk/python/tests/integration/feature_repos/repo_configuration.py b/sdk/python/tests/integration/feature_repos/repo_configuration.py index 48f5070f1e..235c909d5f 100644 --- a/sdk/python/tests/integration/feature_repos/repo_configuration.py +++ b/sdk/python/tests/integration/feature_repos/repo_configuration.py @@ -11,15 +11,26 @@ import pandas as pd import pytest -from feast import FeatureStore, FeatureView, OnDemandFeatureView, driver_test_data +from feast import ( + FeatureStore, + FeatureView, + OnDemandFeatureView, + StreamFeatureView, + driver_test_data, +) from feast.constants import FULL_REPO_CONFIGS_MODULE_ENV_NAME from feast.data_source import DataSource from feast.errors import FeastModuleImportError +from feast.feature_service import FeatureService from feast.infra.feature_servers.base_config import ( BaseFeatureServerConfig, FeatureLoggingConfig, ) from feast.infra.feature_servers.local_process.config import LocalFeatureServerConfig +from feast.permissions.action import AuthzedAction +from feast.permissions.auth_model import OidcAuthConfig +from feast.permissions.permission import Permission +from feast.permissions.policy import RoleBasedPolicy from feast.repo_config import RegistryConfig, RepoConfig from feast.utils import _utc_now from tests.integration.feature_repos.integration_test_repo_config import ( @@ -36,6 +47,7 @@ DuckDBDataSourceCreator, DuckDBDeltaDataSourceCreator, FileDataSourceCreator, + RemoteOfflineOidcAuthStoreDataSourceCreator, RemoteOfflineStoreDataSourceCreator, ) from tests.integration.feature_repos.universal.data_sources.redshift import ( @@ -124,6 +136,7 @@ 
("local", DuckDBDataSourceCreator), ("local", DuckDBDeltaDataSourceCreator), ("local", RemoteOfflineStoreDataSourceCreator), + ("local", RemoteOfflineOidcAuthStoreDataSourceCreator), ] if os.getenv("FEAST_IS_LOCAL_TEST", "False") == "True": @@ -134,7 +147,6 @@ ] ) - AVAILABLE_ONLINE_STORES: Dict[ str, Tuple[Union[str, Dict[Any, Any]], Optional[Type[OnlineStoreCreator]]] ] = {"sqlite": ({"type": "sqlite"}, None)} @@ -164,7 +176,6 @@ # containerized version of IKV. # AVAILABLE_ONLINE_STORES["ikv"] = (IKV_CONFIG, None) - full_repo_configs_module = os.environ.get(FULL_REPO_CONFIGS_MODULE_ENV_NAME) if full_repo_configs_module is not None: try: @@ -200,7 +211,6 @@ for c in FULL_REPO_CONFIGS } - # Replace online stores with emulated online stores if we're running local integration tests if os.getenv("FEAST_LOCAL_ONLINE_CONTAINER", "False").lower() == "true": replacements: Dict[ @@ -432,6 +442,7 @@ def setup(self): feature_server=self.feature_server, entity_key_serialization_version=self.entity_key_serialization_version, ) + self.feature_store = FeatureStore(config=self.config) def teardown(self): @@ -441,6 +452,73 @@ def teardown(self): self.online_store_creator.teardown() +@dataclass +class OfflineServerPermissionsEnvironment(Environment): + def setup(self): + self.data_source_creator.setup(self.registry) + keycloak_url = self.data_source_creator.get_keycloak_url() + auth_config = OidcAuthConfig( + client_id="feast-integration-client", + client_secret="feast-integration-client-secret", + username="reader_writer", + password="password", + realm="master", + type="oidc", + auth_server_url=keycloak_url, + auth_discovery_url=f"{keycloak_url}/realms/master/.well-known" + f"/openid-configuration", + ) + self.config = RepoConfig( + registry=self.registry, + project=self.project, + provider=self.provider, + offline_store=self.data_source_creator.create_offline_store_config(), + online_store=self.online_store_creator.create_online_store() + if self.online_store_creator + else self.online_store, + batch_engine=self.batch_engine, + repo_path=self.repo_dir_name, + feature_server=self.feature_server, + entity_key_serialization_version=self.entity_key_serialization_version, + auth=auth_config, + ) + + self.feature_store = FeatureStore(config=self.config) + permissions_list = [ + Permission( + name="offline_fv_perm", + types=FeatureView, + policy=RoleBasedPolicy(roles=["writer"]), + actions=[AuthzedAction.READ_OFFLINE, AuthzedAction.WRITE_OFFLINE], + ), + Permission( + name="offline_odfv_perm", + types=OnDemandFeatureView, + policy=RoleBasedPolicy(roles=["writer"]), + actions=[AuthzedAction.READ_OFFLINE, AuthzedAction.WRITE_OFFLINE], + ), + Permission( + name="offline_sfv_perm", + types=StreamFeatureView, + policy=RoleBasedPolicy(roles=["writer"]), + actions=[AuthzedAction.READ_OFFLINE, AuthzedAction.WRITE_OFFLINE], + ), + Permission( + name="offline_fs_perm", + types=FeatureService, + policy=RoleBasedPolicy(roles=["writer"]), + actions=[AuthzedAction.READ_OFFLINE, AuthzedAction.WRITE_OFFLINE], + ), + Permission( + name="offline_datasource_perm", + types=DataSource, + policy=RoleBasedPolicy(roles=["writer"]), + actions=[AuthzedAction.READ_OFFLINE, AuthzedAction.WRITE_OFFLINE], + ), + ] + self.feature_store.apply(permissions_list) + + def table_name_from_data_source(ds: DataSource) -> Optional[str]: if hasattr(ds, "table_ref"): return ds.table_ref # type: ignore @@ -491,23 +569,27 @@ def construct_test_environment( cache_ttl_seconds=1, ) - environment = Environment( - name=project, - 
provider=test_repo_config.provider, - data_source_creator=offline_creator, - python_feature_server=test_repo_config.python_feature_server, - worker_id=worker_id, - online_store_creator=online_creator, - fixture_request=fixture_request, - project=project, - registry=registry, - feature_server=feature_server, - entity_key_serialization_version=entity_key_serialization_version, - repo_dir_name=repo_dir_name, - batch_engine=test_repo_config.batch_engine, - online_store=test_repo_config.online_store, - ) + environment_params = { + "name": project, + "provider": test_repo_config.provider, + "data_source_creator": offline_creator, + "python_feature_server": test_repo_config.python_feature_server, + "worker_id": worker_id, + "online_store_creator": online_creator, + "fixture_request": fixture_request, + "project": project, + "registry": registry, + "feature_server": feature_server, + "entity_key_serialization_version": entity_key_serialization_version, + "repo_dir_name": repo_dir_name, + "batch_engine": test_repo_config.batch_engine, + "online_store": test_repo_config.online_store, + } + if not isinstance(offline_creator, RemoteOfflineOidcAuthStoreDataSourceCreator): + environment = Environment(**environment_params) + else: + environment = OfflineServerPermissionsEnvironment(**environment_params) return environment diff --git a/sdk/python/tests/integration/feature_repos/universal/data_sources/file.py b/sdk/python/tests/integration/feature_repos/universal/data_sources/file.py index e505986350..b600699f81 100644 --- a/sdk/python/tests/integration/feature_repos/universal/data_sources/file.py +++ b/sdk/python/tests/integration/feature_repos/universal/data_sources/file.py @@ -33,6 +33,7 @@ from tests.integration.feature_repos.universal.data_source_creator import ( DataSourceCreator, ) +from tests.utils.auth_permissions_util import include_auth_config from tests.utils.http_server import check_port_open, free_port # noqa: E402 logger = logging.getLogger(__name__) @@ -428,3 +429,95 @@ def teardown(self): ), timeout_secs=30, ) + + +class RemoteOfflineOidcAuthStoreDataSourceCreator(FileDataSourceCreator): + def __init__(self, project_name: str, *args, **kwargs): + super().__init__(project_name) + if "fixture_request" in kwargs: + request = kwargs["fixture_request"] + self.keycloak_url = request.getfixturevalue("start_keycloak_server") + else: + raise RuntimeError( + "fixture_request object is not passed to inject keycloak fixture dynamically." 
+ ) + auth_config_template = """ +auth: + type: oidc + client_id: feast-integration-client + client_secret: feast-integration-client-secret + username: reader_writer + password: password + realm: master + auth_server_url: {keycloak_url} + auth_discovery_url: {keycloak_url}/realms/master/.well-known/openid-configuration +""" + self.auth_config = auth_config_template.format(keycloak_url=self.keycloak_url) + self.server_port: int = 0 + self.proc = None + + def setup(self, registry: RegistryConfig): + parent_offline_config = super().create_offline_store_config() + config = RepoConfig( + project=self.project_name, + provider="local", + offline_store=parent_offline_config, + registry=registry.path, + entity_key_serialization_version=2, + ) + + repo_path = Path(tempfile.mkdtemp()) + with open(repo_path / "feature_store.yaml", "w") as outfile: + yaml.dump(config.model_dump(by_alias=True), outfile) + repo_path = str(repo_path.resolve()) + + include_auth_config( + file_path=f"{repo_path}/feature_store.yaml", auth_config=self.auth_config + ) + + self.server_port = free_port() + host = "0.0.0.0" + cmd = [ + "feast", + "-c" + repo_path, + "serve_offline", + "--host", + host, + "--port", + str(self.server_port), + ] + self.proc = subprocess.Popen( + cmd, stdout=subprocess.PIPE, stderr=subprocess.DEVNULL + ) + + _time_out_sec: int = 60 + # Wait for server to start + wait_retry_backoff( + lambda: (None, check_port_open(host, self.server_port)), + timeout_secs=_time_out_sec, + timeout_msg=f"Unable to start the feast remote offline server in {_time_out_sec} seconds at port={self.server_port}", + ) + return "grpc+tcp://{}:{}".format(host, self.server_port) + + def create_offline_store_config(self) -> FeastConfigBaseModel: + self.remote_offline_store_config = RemoteOfflineStoreConfig( + type="remote", host="0.0.0.0", port=self.server_port + ) + return self.remote_offline_store_config + + def get_keycloak_url(self): + return self.keycloak_url + + def teardown(self): + super().teardown() + if self.proc is not None: + self.proc.kill() + + # wait server to free the port + wait_retry_backoff( + lambda: ( + None, + not check_port_open("localhost", self.server_port), + ), + timeout_secs=30, + ) diff --git a/sdk/python/tests/integration/offline_store/test_universal_historical_retrieval.py b/sdk/python/tests/integration/offline_store/test_universal_historical_retrieval.py index ecaa5f40db..97ad54251f 100644 --- a/sdk/python/tests/integration/offline_store/test_universal_historical_retrieval.py +++ b/sdk/python/tests/integration/offline_store/test_universal_historical_retrieval.py @@ -21,6 +21,7 @@ table_name_from_data_source, ) from tests.integration.feature_repos.universal.data_sources.file import ( + RemoteOfflineOidcAuthStoreDataSourceCreator, RemoteOfflineStoreDataSourceCreator, ) from tests.integration.feature_repos.universal.data_sources.snowflake import ( @@ -162,7 +163,11 @@ def test_historical_features_main( ) if not isinstance( - environment.data_source_creator, RemoteOfflineStoreDataSourceCreator + environment.data_source_creator, + ( + RemoteOfflineStoreDataSourceCreator, + RemoteOfflineOidcAuthStoreDataSourceCreator, + ), ): assert_feature_service_correctness( store, diff --git a/sdk/python/tests/integration/online_store/test_remote_online_store.py b/sdk/python/tests/integration/online_store/test_remote_online_store.py index 21ac00583b..f74fb14a86 100644 --- a/sdk/python/tests/integration/online_store/test_remote_online_store.py +++ b/sdk/python/tests/integration/online_store/test_remote_online_store.py 
@@ -1,28 +1,59 @@ import os -import subprocess import tempfile from textwrap import dedent import pytest +from feast import FeatureView, OnDemandFeatureView, StreamFeatureView from feast.feature_store import FeatureStore -from feast.utils import _utc_now -from feast.wait import wait_retry_backoff +from feast.permissions.action import AuthzedAction +from feast.permissions.permission import Permission +from feast.permissions.policy import RoleBasedPolicy +from tests.utils.auth_permissions_util import ( + PROJECT_NAME, + default_store, + start_feature_server, +) from tests.utils.cli_repo_creator import CliRunner -from tests.utils.http_server import check_port_open, free_port +from tests.utils.http_server import free_port @pytest.mark.integration -def test_remote_online_store_read(): +def test_remote_online_store_read(auth_config): with tempfile.TemporaryDirectory() as remote_server_tmp_dir, tempfile.TemporaryDirectory() as remote_client_tmp_dir: + permissions_list = [ + Permission( + name="online_list_fv_perm", + types=FeatureView, + policy=RoleBasedPolicy(roles=["reader"]), + actions=[AuthzedAction.READ_ONLINE], + ), + Permission( + name="online_list_odfv_perm", + types=OnDemandFeatureView, + policy=RoleBasedPolicy(roles=["reader"]), + actions=[AuthzedAction.READ_ONLINE], + ), + Permission( + name="online_list_sfv_perm", + types=StreamFeatureView, + policy=RoleBasedPolicy(roles=["reader"]), + actions=[AuthzedAction.READ_ONLINE], + ), + ] server_store, server_url, registry_path = ( - _create_server_store_spin_feature_server(temp_dir=remote_server_tmp_dir) + _create_server_store_spin_feature_server( + temp_dir=remote_server_tmp_dir, + auth_config=auth_config, + permissions_list=permissions_list, + ) ) assert None not in (server_store, server_url, registry_path) client_store = _create_remote_client_feature_store( temp_dir=remote_client_tmp_dir, server_registry_path=str(registry_path), feature_server_url=server_url, + auth_config=auth_config, ) assert client_store is not None _assert_non_existing_entity_feature_views_entity( @@ -127,11 +158,13 @@ def _assert_client_server_online_stores_are_matching( assert online_features_from_client == online_features_from_server -def _create_server_store_spin_feature_server(temp_dir): +def _create_server_store_spin_feature_server( + temp_dir, auth_config: str, permissions_list +): + store = default_store(str(temp_dir), auth_config, permissions_list) feast_server_port = free_port() - store = _default_store(str(temp_dir), "REMOTE_ONLINE_SERVER_PROJECT") server_url = next( - _start_feature_server( + start_feature_server( repo_path=str(store.repo_path), server_port=feast_server_port ) ) @@ -139,24 +172,8 @@ def _create_server_store_spin_feature_server(temp_dir): return store, server_url, os.path.join(store.repo_path, "data", "registry.db") -def _default_store(temp_dir, project_name) -> FeatureStore: - runner = CliRunner() - result = runner.run(["init", project_name], cwd=temp_dir) - repo_path = os.path.join(temp_dir, project_name, "feature_repo") - assert result.returncode == 0 - - result = runner.run(["--chdir", repo_path, "apply"], cwd=temp_dir) - assert result.returncode == 0 - - fs = FeatureStore(repo_path=repo_path) - fs.materialize_incremental( - end_date=_utc_now(), feature_views=["driver_hourly_stats"] - ) - return fs - - def _create_remote_client_feature_store( - temp_dir, server_registry_path: str, feature_server_url: str + temp_dir, server_registry_path: str, feature_server_url: str, auth_config: str ) -> FeatureStore: project_name = 
"REMOTE_ONLINE_CLIENT_PROJECT" runner = CliRunner() @@ -167,6 +184,7 @@ def _create_remote_client_feature_store( repo_path=str(repo_path), registry_path=server_registry_path, feature_server_url=feature_server_url, + auth_config=auth_config, ) result = runner.run(["--chdir", repo_path, "apply"], cwd=temp_dir) @@ -176,14 +194,14 @@ def _create_remote_client_feature_store( def _overwrite_remote_client_feature_store_yaml( - repo_path: str, registry_path: str, feature_server_url: str + repo_path: str, registry_path: str, feature_server_url: str, auth_config: str ): repo_config = os.path.join(repo_path, "feature_store.yaml") with open(repo_config, "w") as repo_config: repo_config.write( dedent( f""" - project: REMOTE_ONLINE_CLIENT_PROJECT + project: {PROJECT_NAME} registry: {registry_path} provider: local online_store: @@ -192,57 +210,5 @@ def _overwrite_remote_client_feature_store_yaml( entity_key_serialization_version: 2 """ ) - ) - - -def _start_feature_server(repo_path: str, server_port: int, metrics: bool = False): - host = "0.0.0.0" - cmd = [ - "feast", - "-c" + repo_path, - "serve", - "--host", - host, - "--port", - str(server_port), - ] - feast_server_process = subprocess.Popen( - cmd, stdout=subprocess.PIPE, stderr=subprocess.DEVNULL - ) - _time_out_sec: int = 60 - # Wait for server to start - wait_retry_backoff( - lambda: (None, check_port_open(host, server_port)), - timeout_secs=_time_out_sec, - timeout_msg=f"Unable to start the feast server in {_time_out_sec} seconds for remote online store type, port={server_port}", - ) - - if metrics: - cmd.append("--metrics") - - # Check if metrics are enabled and Prometheus server is running - if metrics: - wait_retry_backoff( - lambda: (None, check_port_open("localhost", 8000)), - timeout_secs=_time_out_sec, - timeout_msg="Unable to start the Prometheus server in 60 seconds.", - ) - else: - assert not check_port_open( - "localhost", 8000 - ), "Prometheus server is running when it should be disabled." 
- - yield f"http://localhost:{server_port}" - - if feast_server_process is not None: - feast_server_process.kill() - - # wait server to free the port - wait_retry_backoff( - lambda: ( - None, - not check_port_open("localhost", server_port), - ), - timeout_msg=f"Unable to stop the feast server in {_time_out_sec} seconds for remote online store type, port={server_port}", - timeout_secs=_time_out_sec, + + auth_config ) diff --git a/sdk/python/tests/integration/registration/test_universal_cli.py b/sdk/python/tests/integration/registration/test_universal_cli.py index fc90108d78..9e02ded4e4 100644 --- a/sdk/python/tests/integration/registration/test_universal_cli.py +++ b/sdk/python/tests/integration/registration/test_universal_cli.py @@ -61,6 +61,8 @@ def test_universal_cli(): assertpy.assert_that(result.returncode).is_equal_to(0) result = runner.run(["data-sources", "list"], cwd=repo_path) assertpy.assert_that(result.returncode).is_equal_to(0) + result = runner.run(["permissions", "list"], cwd=repo_path) + assertpy.assert_that(result.returncode).is_equal_to(0) # entity & feature view describe commands should succeed when objects exist result = runner.run(["entities", "describe", "driver"], cwd=repo_path) @@ -91,6 +93,8 @@ def test_universal_cli(): assertpy.assert_that(result.returncode).is_equal_to(1) result = runner.run(["data-sources", "describe", "foo"], cwd=repo_path) assertpy.assert_that(result.returncode).is_equal_to(1) + result = runner.run(["permissions", "describe", "foo"], cwd=repo_path) + assertpy.assert_that(result.returncode).is_equal_to(1) # Doing another apply should be a no op, and should not cause errors result = runner.run(["apply"], cwd=repo_path) diff --git a/sdk/python/tests/integration/registration/test_universal_registry.py b/sdk/python/tests/integration/registration/test_universal_registry.py index 9dcd1b5b91..c528cee4a8 100644 --- a/sdk/python/tests/integration/registration/test_universal_registry.py +++ b/sdk/python/tests/integration/registration/test_universal_registry.py @@ -40,6 +40,9 @@ from feast.infra.registry.remote import RemoteRegistry, RemoteRegistryConfig from feast.infra.registry.sql import SqlRegistry from feast.on_demand_feature_view import on_demand_feature_view +from feast.permissions.action import AuthzedAction +from feast.permissions.permission import Permission +from feast.permissions.policy import RoleBasedPolicy from feast.protos.feast.registry import RegistryServer_pb2, RegistryServer_pb2_grpc from feast.registry_server import RegistryServer from feast.repo_config import RegistryConfig @@ -270,7 +273,9 @@ def mock_remote_registry(): proxied_registry = Registry("project", registry_config, None) registry = RemoteRegistry( - registry_config=RemoteRegistryConfig(path=""), project=None, repo_path=None + registry_config=RemoteRegistryConfig(path=""), + project=None, + repo_path=None, ) mock_channel = GrpcMockChannel( RegistryServer_pb2.DESCRIPTOR.services_by_name["RegistryServer"], @@ -1154,7 +1159,9 @@ def simple_udf(x: int): assert stream_feature_views[0] == sfv test_registry.delete_feature_view("test kafka stream feature view", project) - stream_feature_views = test_registry.list_stream_feature_views(project) + stream_feature_views = test_registry.list_stream_feature_views( + project, tags=sfv.tags + ) assert len(stream_feature_views) == 0 test_registry.teardown() @@ -1343,3 +1350,138 @@ def validate_project_uuid(project_uuid, test_registry): assert len(test_registry.cached_registry_proto.project_metadata) == 1 project_metadata = 
test_registry.cached_registry_proto.project_metadata[0] assert project_metadata.project_uuid == project_uuid + + +@pytest.mark.integration +@pytest.mark.parametrize("test_registry", all_fixtures) +def test_apply_permission_success(test_registry): + permission = Permission( + name="read_permission", + actions=AuthzedAction.DESCRIBE, + policy=RoleBasedPolicy(roles=["reader"]), + types=FeatureView, + ) + + project = "project" + + # Register Permission + test_registry.apply_permission(permission, project) + project_metadata = test_registry.list_project_metadata(project=project) + assert len(project_metadata) == 1 + project_uuid = project_metadata[0].project_uuid + assert len(project_metadata[0].project_uuid) == 36 + assert_project_uuid(project, project_uuid, test_registry) + + permissions = test_registry.list_permissions(project) + assert_project_uuid(project, project_uuid, test_registry) + + permission = permissions[0] + assert ( + len(permissions) == 1 + and permission.name == "read_permission" + and len(permission.types) == 1 + and permission.types[0] == FeatureView + and len(permission.actions) == 1 + and permission.actions[0] == AuthzedAction.DESCRIBE + and isinstance(permission.policy, RoleBasedPolicy) + and len(permission.policy.roles) == 1 + and permission.policy.roles[0] == "reader" + and permission.name_pattern is None + and permission.tags is None + and permission.required_tags is None + ) + + # After the first apply, the created_timestamp should be the same as the last_update_timestamp. + assert permission.created_timestamp == permission.last_updated_timestamp + + permission = test_registry.get_permission("read_permission", project) + assert ( + permission.name == "read_permission" + and len(permission.types) == 1 + and permission.types[0] == FeatureView + and len(permission.actions) == 1 + and permission.actions[0] == AuthzedAction.DESCRIBE + and isinstance(permission.policy, RoleBasedPolicy) + and len(permission.policy.roles) == 1 + and permission.policy.roles[0] == "reader" + and permission.name_pattern is None + and permission.tags is None + and permission.required_tags is None + ) + + # Update permission + updated_permission = Permission( + name="read_permission", + actions=[AuthzedAction.DESCRIBE, AuthzedAction.WRITE_ONLINE], + policy=RoleBasedPolicy(roles=["reader", "writer"]), + types=FeatureView, + ) + test_registry.apply_permission(updated_permission, project) + + permissions = test_registry.list_permissions(project) + assert_project_uuid(project, project_uuid, test_registry) + assert len(permissions) == 1 + + updated_permission = test_registry.get_permission("read_permission", project) + assert ( + updated_permission.name == "read_permission" + and len(updated_permission.types) == 1 + and updated_permission.types[0] == FeatureView + and len(updated_permission.actions) == 2 + and AuthzedAction.DESCRIBE in updated_permission.actions + and AuthzedAction.WRITE_ONLINE in updated_permission.actions + and isinstance(updated_permission.policy, RoleBasedPolicy) + and len(updated_permission.policy.roles) == 2 + and "reader" in updated_permission.policy.roles + and "writer" in updated_permission.policy.roles + and updated_permission.name_pattern is None + and updated_permission.tags is None + and updated_permission.required_tags is None + ) + + # The created_timestamp for the entity should be set to the created_timestamp value stored from the previous apply + assert ( + updated_permission.created_timestamp is not None + and updated_permission.created_timestamp == 
permission.created_timestamp + ) + + updated_permission = Permission( + name="read_permission", + actions=[AuthzedAction.DESCRIBE, AuthzedAction.WRITE_ONLINE], + policy=RoleBasedPolicy(roles=["reader", "writer"]), + types=FeatureView, + name_pattern="aaa", + tags={"team": "matchmaking"}, + required_tags={"tag1": "tag1-value"}, + ) + test_registry.apply_permission(updated_permission, project) + + permissions = test_registry.list_permissions(project) + assert_project_uuid(project, project_uuid, test_registry) + assert len(permissions) == 1 + + updated_permission = test_registry.get_permission("read_permission", project) + assert ( + updated_permission.name == "read_permission" + and len(updated_permission.types) == 1 + and updated_permission.types[0] == FeatureView + and len(updated_permission.actions) == 2 + and AuthzedAction.DESCRIBE in updated_permission.actions + and AuthzedAction.WRITE_ONLINE in updated_permission.actions + and isinstance(updated_permission.policy, RoleBasedPolicy) + and len(updated_permission.policy.roles) == 2 + and "reader" in updated_permission.policy.roles + and "writer" in updated_permission.policy.roles + and updated_permission.name_pattern == "aaa" + and "team" in updated_permission.tags + and updated_permission.tags["team"] == "matchmaking" + and updated_permission.required_tags["tag1"] == "tag1-value" + ) + + test_registry.delete_permission("read_permission", project) + assert_project_uuid(project, project_uuid, test_registry) + permissions = test_registry.list_permissions(project) + assert_project_uuid(project, project_uuid, test_registry) + assert len(permissions) == 0 + + test_registry.teardown() diff --git a/sdk/python/tests/unit/diff/test_registry_diff.py b/sdk/python/tests/unit/diff/test_registry_diff.py index c209f1e0e0..2834c57800 100644 --- a/sdk/python/tests/unit/diff/test_registry_diff.py +++ b/sdk/python/tests/unit/diff/test_registry_diff.py @@ -6,8 +6,12 @@ tag_objects_for_keep_delete_update_add, ) from feast.entity import Entity +from feast.feast_object import ALL_RESOURCE_TYPES from feast.feature_view import FeatureView from feast.on_demand_feature_view import on_demand_feature_view +from feast.permissions.action import AuthzedAction +from feast.permissions.permission import Permission +from feast.permissions.policy import RoleBasedPolicy from feast.types import String from tests.utils.data_source_test_creator import prep_file_source @@ -170,3 +174,22 @@ def test_diff_registry_objects_batch_to_push_source(simple_dataset_1): feast_object_diffs.feast_object_property_diffs[0].property_name == "stream_source" ) + + +def test_diff_registry_objects_permissions(): + pre_changed = Permission( + name="reader", + types=ALL_RESOURCE_TYPES, + policy=RoleBasedPolicy(roles=["reader"]), + actions=[AuthzedAction.DESCRIBE], + ) + post_changed = Permission( + name="reader", + types=ALL_RESOURCE_TYPES, + policy=RoleBasedPolicy(roles=["reader"]), + actions=[AuthzedAction.CREATE], + ) + + feast_object_diffs = diff_registry_objects(pre_changed, post_changed, "permission") + assert len(feast_object_diffs.feast_object_property_diffs) == 1 + assert feast_object_diffs.feast_object_property_diffs[0].property_name == "actions" diff --git a/sdk/python/tests/unit/infra/scaffolding/test_repo_config.py b/sdk/python/tests/unit/infra/scaffolding/test_repo_config.py index 98d82ce357..0725d6d261 100644 --- a/sdk/python/tests/unit/infra/scaffolding/test_repo_config.py +++ b/sdk/python/tests/unit/infra/scaffolding/test_repo_config.py @@ -4,6 +4,12 @@ from typing import Optional from 
feast.infra.online_stores.sqlite import SqliteOnlineStoreConfig +from feast.permissions.auth.auth_type import AuthType +from feast.permissions.auth_model import ( + KubernetesAuthConfig, + NoAuthConfig, + OidcAuthConfig, +) from feast.repo_config import FeastConfigError, load_repo_config @@ -195,3 +201,119 @@ def test_no_provider(): ), expect_error=None, ) + + +def test_auth_config(): + _test_config( + dedent( + """ + project: foo + auth: + client_id: test_client_id + client_secret: test_client_secret + username: test_user_name + password: test_password + realm: master + auth_server_url: http://localhost:8712 + auth_discovery_url: http://localhost:8080/realms/master/.well-known/openid-configuration + registry: "registry.db" + provider: local + online_store: + path: foo + entity_key_serialization_version: 2 + """ + ), + expect_error="missing authentication type", + ) + + _test_config( + dedent( + """ + project: foo + auth: + type: not_valid_auth_type + client_id: test_client_id + client_secret: test_client_secret + username: test_user_name + password: test_password + realm: master + auth_server_url: http://localhost:8712 + auth_discovery_url: http://localhost:8080/realms/master/.well-known/openid-configuration + registry: "registry.db" + provider: local + online_store: + path: foo + entity_key_serialization_version: 2 + """ + ), + expect_error="invalid authentication type=not_valid_auth_type", + ) + + oidc_repo_config = _test_config( + dedent( + """ + project: foo + auth: + type: oidc + client_id: test_client_id + client_secret: test_client_secret + username: test_user_name + password: test_password + realm: master + auth_server_url: http://localhost:8080 + auth_discovery_url: http://localhost:8080/realms/master/.well-known/openid-configuration + registry: "registry.db" + provider: local + online_store: + path: foo + entity_key_serialization_version: 2 + """ + ), + expect_error=None, + ) + assert oidc_repo_config.auth["type"] == AuthType.OIDC.value + assert isinstance(oidc_repo_config.auth_config, OidcAuthConfig) + assert oidc_repo_config.auth_config.client_id == "test_client_id" + assert oidc_repo_config.auth_config.client_secret == "test_client_secret" + assert oidc_repo_config.auth_config.username == "test_user_name" + assert oidc_repo_config.auth_config.password == "test_password" + assert oidc_repo_config.auth_config.realm == "master" + assert oidc_repo_config.auth_config.auth_server_url == "http://localhost:8080" + assert ( + oidc_repo_config.auth_config.auth_discovery_url + == "http://localhost:8080/realms/master/.well-known/openid-configuration" + ) + + no_auth_repo_config = _test_config( + dedent( + """ + project: foo + registry: "registry.db" + provider: local + online_store: + path: foo + entity_key_serialization_version: 2 + """ + ), + expect_error=None, + ) + assert no_auth_repo_config.auth.get("type") == AuthType.NONE.value + assert isinstance(no_auth_repo_config.auth_config, NoAuthConfig) + + k8_repo_config = _test_config( + dedent( + """ + auth: + type: kubernetes + project: foo + registry: "registry.db" + provider: local + online_store: + path: foo + entity_key_serialization_version: 2 + """ + ), + expect_error=None, + ) + assert k8_repo_config.auth.get("type") == AuthType.KUBERNETES.value + assert isinstance(k8_repo_config.auth_config, KubernetesAuthConfig) diff --git a/sdk/python/tests/unit/local_feast_tests/test_local_feature_store.py b/sdk/python/tests/unit/local_feast_tests/test_local_feature_store.py index 0e834e314b..c86441d56c 100644 --- 
a/sdk/python/tests/unit/local_feast_tests/test_local_feature_store.py +++ b/sdk/python/tests/unit/local_feast_tests/test_local_feature_store.py @@ -9,11 +9,15 @@ from feast.data_format import AvroFormat, ParquetFormat from feast.data_source import KafkaSource from feast.entity import Entity +from feast.feast_object import ALL_RESOURCE_TYPES from feast.feature_store import FeatureStore from feast.feature_view import FeatureView from feast.field import Field from feast.infra.offline_stores.file_source import FileSource from feast.infra.online_stores.sqlite import SqliteOnlineStoreConfig +from feast.permissions.action import AuthzedAction +from feast.permissions.permission import Permission +from feast.permissions.policy import RoleBasedPolicy from feast.repo_config import RepoConfig from feast.stream_feature_view import stream_feature_view from feast.types import Array, Bytes, Float32, Int64, String @@ -338,6 +342,36 @@ def test_apply_entities_and_feature_views(test_feature_store): test_feature_store.teardown() +@pytest.mark.parametrize( + "test_feature_store", + [lazy_fixture("feature_store_with_local_registry")], +) +def test_apply_permissions(test_feature_store): + assert isinstance(test_feature_store, FeatureStore) + + permission = Permission( + name="reader", + types=ALL_RESOURCE_TYPES, + policy=RoleBasedPolicy(roles=["reader"]), + actions=[AuthzedAction.DESCRIBE], + ) + + # Register Permission + test_feature_store.apply([permission]) + + permissions = test_feature_store.list_permissions() + assert len(permissions) == 1 + assert permissions[0] == permission + + # delete Permission + test_feature_store.apply(objects=[], objects_to_delete=[permission], partial=False) + + permissions = test_feature_store.list_permissions() + assert len(permissions) == 0 + + test_feature_store.teardown() + + @pytest.mark.parametrize( "test_feature_store", [lazy_fixture("feature_store_with_local_registry")], diff --git a/sdk/python/tests/unit/permissions/__init__.py b/sdk/python/tests/unit/permissions/__init__.py new file mode 100644 index 0000000000..e69de29bb2 diff --git a/sdk/python/tests/unit/permissions/auth/conftest.py b/sdk/python/tests/unit/permissions/auth/conftest.py new file mode 100644 index 0000000000..dc71aba23b --- /dev/null +++ b/sdk/python/tests/unit/permissions/auth/conftest.py @@ -0,0 +1,101 @@ +import pytest +from kubernetes import client + +from feast.permissions.auth_model import OidcAuthConfig +from tests.unit.permissions.auth.server.test_utils import ( + invalid_list_entities_perm, + read_entities_perm, + read_fv_perm, + read_odfv_perm, + read_permissions_perm, + read_sfv_perm, +) +from tests.unit.permissions.auth.test_token_parser import _CLIENT_ID + + +@pytest.fixture +def sa_name(): + return "my-name" + + +@pytest.fixture +def namespace(): + return "my-ns" + + +@pytest.fixture +def rolebindings(sa_name, namespace) -> dict: + roles = ["reader", "writer"] + items = [] + for r in roles: + items.append( + client.V1RoleBinding( + metadata=client.V1ObjectMeta(name=r, namespace=namespace), + subjects=[ + client.V1Subject( + kind="ServiceAccount", + name=sa_name, + api_group="rbac.authorization.k8s.io", + ) + ], + role_ref=client.V1RoleRef( + kind="Role", name=r, api_group="rbac.authorization.k8s.io" + ), + ) + ) + return {"items": client.V1RoleBindingList(items=items), "roles": roles} + + +@pytest.fixture +def clusterrolebindings(sa_name, namespace) -> dict: + roles = ["updater"] + items = [] + for r in roles: + items.append( + client.V1ClusterRoleBinding( + 
metadata=client.V1ObjectMeta(name=r, namespace=namespace), + subjects=[ + client.V1Subject( + kind="ServiceAccount", + name=sa_name, + namespace=namespace, + api_group="rbac.authorization.k8s.io", + ) + ], + role_ref=client.V1RoleRef( + kind="Role", name=r, api_group="rbac.authorization.k8s.io" + ), + ) + ) + return {"items": client.V1RoleBindingList(items=items), "roles": roles} + + +@pytest.fixture +def oidc_config() -> OidcAuthConfig: + return OidcAuthConfig( + auth_server_url="", + auth_discovery_url="", + client_id=_CLIENT_ID, + client_secret="", + username="", + password="", + realm="", + ) + + +@pytest.fixture( + scope="module", + params=[ + [], + [invalid_list_entities_perm], + [ + read_entities_perm, + read_permissions_perm, + read_fv_perm, + read_odfv_perm, + read_sfv_perm, + ], + ], +) +def applied_permissions(request): + return request.param diff --git a/sdk/python/tests/unit/permissions/auth/server/mock_utils.py b/sdk/python/tests/unit/permissions/auth/server/mock_utils.py new file mode 100644 index 0000000000..8f598774ee --- /dev/null +++ b/sdk/python/tests/unit/permissions/auth/server/mock_utils.py @@ -0,0 +1,72 @@ +from unittest.mock import MagicMock, Mock + +from requests import Response + + +def mock_oidc(request, monkeypatch, client_id): + async def mock_oath2(self, request): + return "OK" + + monkeypatch.setattr( + "feast.permissions.auth.oidc_token_parser.OAuth2AuthorizationCodeBearer.__call__", + mock_oath2, + ) + signing_key = MagicMock() + signing_key.key = "a-key" + monkeypatch.setattr( + "feast.permissions.auth.oidc_token_parser.PyJWKClient.get_signing_key_from_jwt", + lambda self, access_token: signing_key, + ) + user_data = { + "preferred_username": "my-name", + "resource_access": {client_id: {"roles": ["reader", "writer"]}}, + } + monkeypatch.setattr( + "feast.permissions.auth.oidc_token_parser.jwt.decode", + lambda self, *args, **kwargs: user_data, + ) + discovery_response = Mock(spec=Response) + discovery_response.status_code = 200 + discovery_response.json.return_value = { + "token_endpoint": "http://localhost:8080/realms/master/protocol/openid-connect/token" + } + monkeypatch.setattr( + "feast.permissions.client.oidc_authentication_client_manager.requests.get", + lambda url: discovery_response, + ) + token_response = Mock(spec=Response) + token_response.status_code = 200 + token_response.json.return_value = {"access_token": "my-token"} + monkeypatch.setattr( + "feast.permissions.client.oidc_authentication_client_manager.requests.post", + lambda url, data, headers: token_response, + ) + + +def mock_kubernetes(request, monkeypatch): + sa_name = request.getfixturevalue("sa_name") + namespace = request.getfixturevalue("namespace") + subject = f"system:serviceaccount:{namespace}:{sa_name}" + rolebindings = request.getfixturevalue("rolebindings") + clusterrolebindings = request.getfixturevalue("clusterrolebindings") + + monkeypatch.setattr( + "feast.permissions.auth.kubernetes_token_parser.config.load_incluster_config", + lambda: None, + ) + monkeypatch.setattr( + "feast.permissions.auth.kubernetes_token_parser.jwt.decode", + lambda *args, **kwargs: {"sub": subject}, + ) + monkeypatch.setattr( + "feast.permissions.auth.kubernetes_token_parser.client.RbacAuthorizationV1Api.list_namespaced_role_binding", + lambda *args, **kwargs: rolebindings["items"], + ) + monkeypatch.setattr( + "feast.permissions.auth.kubernetes_token_parser.client.RbacAuthorizationV1Api.list_cluster_role_binding", + lambda *args, **kwargs: clusterrolebindings["items"], + ) + 
monkeypatch.setattr( + "feast.permissions.client.kubernetes_auth_client_manager.KubernetesAuthClientManager.get_token", + lambda self: "my-token", + ) diff --git a/sdk/python/tests/unit/permissions/auth/server/test_auth_registry_server.py b/sdk/python/tests/unit/permissions/auth/server/test_auth_registry_server.py new file mode 100644 index 0000000000..bc16bdac3b --- /dev/null +++ b/sdk/python/tests/unit/permissions/auth/server/test_auth_registry_server.py @@ -0,0 +1,239 @@ +from datetime import datetime + +import assertpy +import pandas as pd +import pytest +import yaml + +from feast import ( + FeatureStore, +) +from feast.permissions.permission import Permission +from feast.registry_server import start_server +from feast.wait import wait_retry_backoff # noqa: E402 +from tests.unit.permissions.auth.server import mock_utils +from tests.unit.permissions.auth.server.test_utils import ( + invalid_list_entities_perm, + read_entities_perm, + read_fv_perm, + read_odfv_perm, + read_permissions_perm, + read_sfv_perm, +) +from tests.utils.auth_permissions_util import get_remote_registry_store +from tests.utils.http_server import check_port_open # noqa: E402 + + +@pytest.fixture +def start_registry_server( + request, + auth_config, + server_port, + feature_store, + monkeypatch, +): + if "kubernetes" in auth_config: + mock_utils.mock_kubernetes(request=request, monkeypatch=monkeypatch) + elif "oidc" in auth_config: + auth_config_yaml = yaml.safe_load(auth_config) + mock_utils.mock_oidc( + request=request, + monkeypatch=monkeypatch, + client_id=auth_config_yaml["auth"]["client_id"], + ) + + assertpy.assert_that(server_port).is_not_equal_to(0) + + print(f"Starting Registry at {server_port}") + server = start_server(feature_store, server_port, wait_for_termination=False) + print("Waiting server availability") + wait_retry_backoff( + lambda: (None, check_port_open("localhost", server_port)), + timeout_secs=10, + ) + print("Server started") + + yield server + + print("Stopping server") + server.stop(grace=None) # Teardown server + + +def test_registry_apis( + auth_config, + temp_dir, + server_port, + start_registry_server, + feature_store, + applied_permissions, +): + print(f"Running for\n:{auth_config}") + remote_feature_store = get_remote_registry_store(server_port, feature_store) + permissions = _test_list_permissions(remote_feature_store, applied_permissions) + _test_list_entities(remote_feature_store, applied_permissions) + _test_list_fvs(remote_feature_store, applied_permissions) + + if _permissions_exist_in_permission_list( + [ + read_entities_perm, + read_permissions_perm, + read_fv_perm, + read_odfv_perm, + read_sfv_perm, + ], + permissions, + ): + _test_get_historical_features(remote_feature_store) + + +def _test_get_historical_features(client_fs: FeatureStore): + entity_df = pd.DataFrame.from_dict( + { + # entity's join key -> entity values + "driver_id": [1001, 1002, 1003], + # "event_timestamp" (reserved key) -> timestamps + "event_timestamp": [ + datetime(2021, 4, 12, 10, 59, 42), + datetime(2021, 4, 12, 8, 12, 10), + datetime(2021, 4, 12, 16, 40, 26), + ], + # (optional) label name -> label values. 
Feast does not process these + "label_driver_reported_satisfaction": [1, 5, 3], + # values we're using for an on-demand transformation + "val_to_add": [1, 2, 3], + "val_to_add_2": [10, 20, 30], + } + ) + + training_df = client_fs.get_historical_features( + entity_df=entity_df, + features=[ + "driver_hourly_stats:conv_rate", + "driver_hourly_stats:acc_rate", + "driver_hourly_stats:avg_daily_trips", + "transformed_conv_rate:conv_rate_plus_val1", + "transformed_conv_rate:conv_rate_plus_val2", + ], + ).to_df() + assertpy.assert_that(training_df).is_not_none() + + +def _test_list_entities(client_fs: FeatureStore, permissions: list[Permission]): + entities = client_fs.list_entities() + + if not _is_auth_enabled(client_fs) or _is_permission_enabled( + client_fs, permissions, read_entities_perm + ): + assertpy.assert_that(entities).is_not_none() + assertpy.assert_that(len(entities)).is_equal_to(1) + assertpy.assert_that(entities[0].name).is_equal_to("driver") + else: + assertpy.assert_that(entities).is_not_none() + assertpy.assert_that(len(entities)).is_equal_to(0) + + +def _no_permission_retrieved(permissions: list[Permission]) -> bool: + return len(permissions) == 0 + + +def _test_list_permissions( + client_fs: FeatureStore, applied_permissions: list[Permission] +) -> list[Permission]: + if _is_auth_enabled(client_fs) and _permissions_exist_in_permission_list( + [invalid_list_entities_perm], applied_permissions + ): + with pytest.raises(Exception): + client_fs.list_permissions() + return [] + else: + permissions = client_fs.list_permissions() + + if not _is_auth_enabled(client_fs): + assertpy.assert_that(permissions).is_not_none() + assertpy.assert_that(len(permissions)).is_equal_to(len(applied_permissions)) + elif _is_auth_enabled(client_fs) and _permissions_exist_in_permission_list( + [ + read_entities_perm, + read_permissions_perm, + read_fv_perm, + read_odfv_perm, + read_sfv_perm, + ], + permissions, + ): + assertpy.assert_that(permissions).is_not_none() + assertpy.assert_that(len(permissions)).is_equal_to( + len( + [ + read_entities_perm, + read_permissions_perm, + read_fv_perm, + read_odfv_perm, + read_sfv_perm, + ] + ) + ) + elif _is_auth_enabled(client_fs) and _is_listing_permissions_allowed(permissions): + assertpy.assert_that(permissions).is_not_none() + assertpy.assert_that(len(permissions)).is_equal_to(1) + + return permissions + + +def _is_listing_permissions_allowed(permissions: list[Permission]) -> bool: + return read_permissions_perm in permissions + + +def _is_auth_enabled(client_fs: FeatureStore) -> bool: + return client_fs.config.auth_config.type != "no_auth" + + +def _test_list_fvs(client_fs: FeatureStore, permissions: list[Permission]): + if _is_auth_enabled(client_fs) and _permissions_exist_in_permission_list( + [invalid_list_entities_perm], permissions + ): + with pytest.raises(Exception): + client_fs.list_feature_views() + return [] + else: + fvs = client_fs.list_feature_views() + for fv in fvs: + print(f"{fv.name}, {type(fv).__name__}") + + if not _is_auth_enabled(client_fs) or _is_permission_enabled( + client_fs, permissions, read_fv_perm + ): + assertpy.assert_that(fvs).is_not_none() + assertpy.assert_that(len(fvs)).is_equal_to(2) + + names = _to_names(fvs) + assertpy.assert_that(names).contains("driver_hourly_stats") + assertpy.assert_that(names).contains("driver_hourly_stats_fresh") + else: + assertpy.assert_that(fvs).is_not_none() + assertpy.assert_that(len(fvs)).is_equal_to(0) + + +def _permissions_exist_in_permission_list( + permission_to_test: list[Permission], 
permission_list: list[Permission] +) -> bool: + return all(e in permission_list for e in permission_to_test) + + +def _is_permission_enabled( + client_fs: FeatureStore, + permissions: list[Permission], + permission: Permission, +): + return _is_auth_enabled(client_fs) and ( + _no_permission_retrieved(permissions) + or ( + _permissions_exist_in_permission_list( + [read_permissions_perm, permission], permissions + ) + ) + ) + + +def _to_names(items): + return [i.name for i in items] diff --git a/sdk/python/tests/unit/permissions/auth/server/test_utils.py b/sdk/python/tests/unit/permissions/auth/server/test_utils.py new file mode 100644 index 0000000000..5d781919a0 --- /dev/null +++ b/sdk/python/tests/unit/permissions/auth/server/test_utils.py @@ -0,0 +1,61 @@ +import assertpy +import pytest + +from feast import Entity, FeatureView, OnDemandFeatureView, StreamFeatureView +from feast.permissions.action import AuthzedAction +from feast.permissions.permission import Permission +from feast.permissions.policy import RoleBasedPolicy +from feast.permissions.server.utils import AuthManagerType, str_to_auth_manager_type + +read_permissions_perm = Permission( + name="read_permissions_perm", + types=Permission, + policy=RoleBasedPolicy(roles=["reader"]), + actions=[AuthzedAction.DESCRIBE], +) + +read_entities_perm = Permission( + name="read_entities_perm", + types=Entity, + policy=RoleBasedPolicy(roles=["reader"]), + actions=[AuthzedAction.DESCRIBE], +) + +read_fv_perm = Permission( + name="read_fv_perm", + types=FeatureView, + policy=RoleBasedPolicy(roles=["reader"]), + actions=[AuthzedAction.DESCRIBE], +) + +read_odfv_perm = Permission( + name="read_odfv_perm", + types=OnDemandFeatureView, + policy=RoleBasedPolicy(roles=["reader"]), + actions=[AuthzedAction.DESCRIBE], +) + +read_sfv_perm = Permission( + name="read_sfv_perm", + types=StreamFeatureView, + policy=RoleBasedPolicy(roles=["reader"]), + actions=[AuthzedAction.DESCRIBE], +) + +invalid_list_entities_perm = Permission( + name="invalid_list_entity_perm", + types=Entity, + policy=RoleBasedPolicy(roles=["dancer"]), + actions=[AuthzedAction.DESCRIBE], +) + + +@pytest.mark.parametrize( + "label, value", + [(t.value, t) for t in AuthManagerType] + + [(t.value.upper(), t) for t in AuthManagerType] + + [(t.value.lower(), t) for t in AuthManagerType] + + [("none", AuthManagerType.NONE)], +) +def test_str_to_auth_type(label, value): + assertpy.assert_that(str_to_auth_manager_type(label)).is_equal_to(value) diff --git a/sdk/python/tests/unit/permissions/auth/test_token_extractor.py b/sdk/python/tests/unit/permissions/auth/test_token_extractor.py new file mode 100644 index 0000000000..a6fcd89e5b --- /dev/null +++ b/sdk/python/tests/unit/permissions/auth/test_token_extractor.py @@ -0,0 +1,140 @@ +from unittest.mock import Mock + +import assertpy +import pytest +from fastapi.requests import Request +from starlette.authentication import ( + AuthenticationError, +) + +from feast.permissions.server.arrow_flight_token_extractor import ( + ArrowFlightTokenExtractor, +) +from feast.permissions.server.grpc_token_extractor import GrpcTokenExtractor +from feast.permissions.server.rest_token_extractor import RestTokenExtractor + + +@pytest.mark.parametrize( + "error_type, dict, header", + [ + (ValueError, {}, None), + (ValueError, {"other": 123}, None), + (AuthenticationError, {}, ""), + (AuthenticationError, {}, "abcd"), + (AuthenticationError, {}, "other-scheme abcd"), + ], +) +def test_rest_token_extractor_failures(error_type, dict, header): + token_extractor = 
RestTokenExtractor() + + request = None + if header is not None: + request = Mock(spec=Request) + if header != "": + request.headers = {"authorization": header} + else: + request.headers = {} + with pytest.raises(error_type): + if request is None: + token_extractor.extract_access_token(**dict) + else: + token_extractor.extract_access_token(request=request) + + +@pytest.mark.parametrize( + "error_type, dict, header", + [ + (ValueError, {}, None), + (ValueError, {"other": 123}, None), + (AuthenticationError, {}, ""), + (AuthenticationError, {}, "abcd"), + (AuthenticationError, {}, "other-scheme abcd"), + ], +) +def test_grpc_token_extractor_failures(error_type, dict, header): + token_extractor = GrpcTokenExtractor() + + metadata = None + if header is not None: + metadata = {} + if metadata != "": + metadata["authorization"] = header + with pytest.raises(error_type): + if metadata is None: + token_extractor.extract_access_token(**dict) + else: + token_extractor.extract_access_token(metadata=metadata) + + +def test_rest_token_extractor(): + token_extractor = RestTokenExtractor() + request: Request = Mock(spec=Request) + token = "abcd" + + request.headers = {"authorization": f"Bearer {token}"} + assertpy.assert_that( + token_extractor.extract_access_token(request=request) + ).is_equal_to(token) + + request.headers = {"authorization": f"bearer {token}"} + assertpy.assert_that( + token_extractor.extract_access_token(request=request) + ).is_equal_to(token) + + +def test_grpc_token_extractor(): + token_extractor = GrpcTokenExtractor() + metadata = {} + token = "abcd" + + metadata["authorization"] = f"Bearer {token}" + assertpy.assert_that( + token_extractor.extract_access_token(metadata=metadata) + ).is_equal_to(token) + + metadata["authorization"] = f"bearer {token}" + assertpy.assert_that( + token_extractor.extract_access_token(metadata=metadata) + ).is_equal_to(token) + + +@pytest.mark.parametrize( + "error_type, dict, header", + [ + (ValueError, {}, None), + (ValueError, {"other": 123}, None), + (AuthenticationError, {}, ""), + (AuthenticationError, {}, "abcd"), + (AuthenticationError, {}, ["abcd"]), + (AuthenticationError, {}, ["other-scheme abcd"]), + ], +) +def test_arrow_flight_token_extractor_failures(error_type, dict, header): + token_extractor = ArrowFlightTokenExtractor() + + headers = None + if header is not None: + if header != "": + headers = {"authorization": header} + else: + headers = {} + with pytest.raises(error_type): + if headers is None: + token_extractor.extract_access_token(**dict) + else: + token_extractor.extract_access_token(headers=headers) + + +def test_arrow_flight_token_extractor(): + token_extractor = ArrowFlightTokenExtractor() + token = "abcd" + + headers = {"authorization": [f"Bearer {token}"]} + assertpy.assert_that( + token_extractor.extract_access_token(headers=headers) + ).is_equal_to(token) + + headers = {"authorization": [f"bearer {token}"]} + assertpy.assert_that( + token_extractor.extract_access_token(headers=headers) + ).is_equal_to(token) diff --git a/sdk/python/tests/unit/permissions/auth/test_token_parser.py b/sdk/python/tests/unit/permissions/auth/test_token_parser.py new file mode 100644 index 0000000000..6ae9094f81 --- /dev/null +++ b/sdk/python/tests/unit/permissions/auth/test_token_parser.py @@ -0,0 +1,122 @@ +# test_token_validator.py + +import asyncio +from unittest.mock import MagicMock, patch + +import assertpy +import pytest +from starlette.authentication import ( + AuthenticationError, +) + +from 
feast.permissions.auth.kubernetes_token_parser import KubernetesTokenParser +from feast.permissions.auth.oidc_token_parser import OidcTokenParser +from feast.permissions.user import User + +_CLIENT_ID = "test" + + +@patch( + "feast.permissions.auth.oidc_token_parser.OAuth2AuthorizationCodeBearer.__call__" +) +@patch("feast.permissions.auth.oidc_token_parser.PyJWKClient.get_signing_key_from_jwt") +@patch("feast.permissions.auth.oidc_token_parser.jwt.decode") +def test_oidc_token_validation_success( + mock_jwt, mock_signing_key, mock_oauth2, oidc_config +): + signing_key = MagicMock() + signing_key.key = "a-key" + mock_signing_key.return_value = signing_key + + user_data = { + "preferred_username": "my-name", + "resource_access": {_CLIENT_ID: {"roles": ["reader", "writer"]}}, + } + mock_jwt.return_value = user_data + + access_token = "aaa-bbb-ccc" + token_parser = OidcTokenParser(auth_config=oidc_config) + user = asyncio.run( + token_parser.user_details_from_access_token(access_token=access_token) + ) + + assertpy.assert_that(user).is_type_of(User) + if isinstance(user, User): + assertpy.assert_that(user.username).is_equal_to("my-name") + assertpy.assert_that(user.roles.sort()).is_equal_to(["reader", "writer"].sort()) + assertpy.assert_that(user.has_matching_role(["reader"])).is_true() + assertpy.assert_that(user.has_matching_role(["writer"])).is_true() + assertpy.assert_that(user.has_matching_role(["updater"])).is_false() + + +@patch( + "feast.permissions.auth.oidc_token_parser.OAuth2AuthorizationCodeBearer.__call__" +) +def test_oidc_token_validation_failure(mock_oauth2, oidc_config): + mock_oauth2.side_effect = AuthenticationError("wrong token") + + access_token = "aaa-bbb-ccc" + token_parser = OidcTokenParser(auth_config=oidc_config) + with pytest.raises(AuthenticationError): + asyncio.run( + token_parser.user_details_from_access_token(access_token=access_token) + ) + + +# TODO RBAC: Move role bindings to a reusable fixture +@patch("feast.permissions.auth.kubernetes_token_parser.config.load_incluster_config") +@patch("feast.permissions.auth.kubernetes_token_parser.jwt.decode") +@patch( + "feast.permissions.auth.kubernetes_token_parser.client.RbacAuthorizationV1Api.list_namespaced_role_binding" +) +@patch( + "feast.permissions.auth.kubernetes_token_parser.client.RbacAuthorizationV1Api.list_cluster_role_binding" +) +def test_k8s_token_validation_success( + mock_crb, + mock_rb, + mock_jwt, + mock_config, + rolebindings, + clusterrolebindings, +): + sa_name = "my-name" + namespace = "my-ns" + subject = f"system:serviceaccount:{namespace}:{sa_name}" + mock_jwt.return_value = {"sub": subject} + + mock_rb.return_value = rolebindings["items"] + mock_crb.return_value = clusterrolebindings["items"] + + roles = rolebindings["roles"] + croles = clusterrolebindings["roles"] + + access_token = "aaa-bbb-ccc" + token_parser = KubernetesTokenParser() + user = asyncio.run( + token_parser.user_details_from_access_token(access_token=access_token) + ) + + assertpy.assert_that(user).is_type_of(User) + if isinstance(user, User): + assertpy.assert_that(user.username).is_equal_to(f"{namespace}:{sa_name}") + assertpy.assert_that(user.roles.sort()).is_equal_to((roles + croles).sort()) + for r in roles: + assertpy.assert_that(user.has_matching_role([r])).is_true() + for cr in croles: + assertpy.assert_that(user.has_matching_role([cr])).is_true() + assertpy.assert_that(user.has_matching_role(["foo"])).is_false() + + +@patch("feast.permissions.auth.kubernetes_token_parser.config.load_incluster_config") 
+@patch("feast.permissions.auth.kubernetes_token_parser.jwt.decode") +def test_k8s_token_validation_failure(mock_jwt, mock_config): + subject = "wrong-subject" + mock_jwt.return_value = {"sub": subject} + + access_token = "aaa-bbb-ccc" + token_parser = KubernetesTokenParser() + with pytest.raises(AuthenticationError): + asyncio.run( + token_parser.user_details_from_access_token(access_token=access_token) + ) diff --git a/sdk/python/tests/unit/permissions/conftest.py b/sdk/python/tests/unit/permissions/conftest.py new file mode 100644 index 0000000000..7cd944fb47 --- /dev/null +++ b/sdk/python/tests/unit/permissions/conftest.py @@ -0,0 +1,88 @@ +from unittest.mock import Mock + +import pytest + +from feast import FeatureView +from feast.infra.registry.base_registry import BaseRegistry +from feast.permissions.decorator import require_permissions +from feast.permissions.permission import AuthzedAction, Permission +from feast.permissions.policy import RoleBasedPolicy +from feast.permissions.security_manager import ( + SecurityManager, + set_security_manager, +) +from feast.permissions.user import User + + +class SecuredFeatureView(FeatureView): + def __init__(self, name, tags): + super().__init__( + name=name, + source=Mock(), + tags=tags, + ) + + @require_permissions(actions=[AuthzedAction.DESCRIBE]) + def read_protected(self) -> bool: + return True + + @require_permissions(actions=[AuthzedAction.UPDATE]) + def write_protected(self) -> bool: + return True + + def unprotected(self) -> bool: + return True + + +@pytest.fixture +def feature_views() -> list[FeatureView]: + return [ + SecuredFeatureView("secured", {}), + SecuredFeatureView("special-secured", {}), + ] + + +@pytest.fixture +def users() -> list[User]: + users = [] + users.append(User("r", ["reader"])) + users.append(User("w", ["writer"])) + users.append(User("rw", ["reader", "writer"])) + users.append(User("admin", ["reader", "writer", "admin"])) + return dict([(u.username, u) for u in users]) + + +@pytest.fixture +def security_manager() -> SecurityManager: + permissions = [] + permissions.append( + Permission( + name="reader", + types=FeatureView, + policy=RoleBasedPolicy(roles=["reader"]), + actions=[AuthzedAction.DESCRIBE], + ) + ) + permissions.append( + Permission( + name="writer", + types=FeatureView, + policy=RoleBasedPolicy(roles=["writer"]), + actions=[AuthzedAction.UPDATE], + ) + ) + permissions.append( + Permission( + name="special", + types=FeatureView, + name_pattern="special.*", + policy=RoleBasedPolicy(roles=["admin", "special-reader"]), + actions=[AuthzedAction.DESCRIBE, AuthzedAction.UPDATE], + ) + ) + + registry = Mock(spec=BaseRegistry) + registry.list_permissions = Mock(return_value=permissions) + sm = SecurityManager(project="any", registry=registry) + set_security_manager(sm) + return sm diff --git a/sdk/python/tests/unit/permissions/test_decision.py b/sdk/python/tests/unit/permissions/test_decision.py new file mode 100644 index 0000000000..23bafedeab --- /dev/null +++ b/sdk/python/tests/unit/permissions/test_decision.py @@ -0,0 +1,34 @@ +import assertpy +import pytest + +from feast.permissions.decision import DecisionEvaluator + +# Each vote is a tuple of `current_vote` and expected output of `is_decided` + + +@pytest.mark.parametrize( + "evaluator, votes, decision, no_of_explanations", + [ + (DecisionEvaluator(3), [(True, True)], True, 0), + (DecisionEvaluator(3), [(True, True)], True, 0), + ( + DecisionEvaluator(3), + [(False, False), (False, False), (False, True)], + False, + 3, + ), + ], +) +def 
test_decision_evaluator(evaluator, votes, decision, no_of_explanations): + for v in votes: + vote = v[0] + decided = v[1] + evaluator.add_grant(vote, "" if vote else "a message") + if decided: + assertpy.assert_that(evaluator.is_decided()).is_true() + else: + assertpy.assert_that(evaluator.is_decided()).is_false() + + grant, explanations = evaluator.grant() + assertpy.assert_that(grant).is_equal_to(decision) + assertpy.assert_that(explanations).is_length(no_of_explanations) diff --git a/sdk/python/tests/unit/permissions/test_decorator.py b/sdk/python/tests/unit/permissions/test_decorator.py new file mode 100644 index 0000000000..8f6c2c420b --- /dev/null +++ b/sdk/python/tests/unit/permissions/test_decorator.py @@ -0,0 +1,32 @@ +import assertpy +import pytest + + +@pytest.mark.parametrize( + "username, can_read, can_write", + [ + (None, False, False), + ("r", True, False), + ("w", False, True), + ("rw", True, True), + ], +) +def test_access_SecuredFeatureView( + security_manager, feature_views, users, username, can_read, can_write +): + sm = security_manager + fv = feature_views[0] + user = users.get(username) + + sm.set_current_user(user) + if can_read: + fv.read_protected() + else: + with pytest.raises(PermissionError): + fv.read_protected() + if can_write: + fv.write_protected() + else: + with pytest.raises(PermissionError): + fv.write_protected() + assertpy.assert_that(fv.unprotected()).is_true() diff --git a/sdk/python/tests/unit/permissions/test_oidc_auth_client.py b/sdk/python/tests/unit/permissions/test_oidc_auth_client.py new file mode 100644 index 0000000000..22ed5b6f87 --- /dev/null +++ b/sdk/python/tests/unit/permissions/test_oidc_auth_client.py @@ -0,0 +1,62 @@ +from unittest.mock import patch + +from requests import Session + +from feast.permissions.auth_model import ( + KubernetesAuthConfig, + NoAuthConfig, + OidcAuthConfig, +) +from feast.permissions.client.http_auth_requests_wrapper import ( + AuthenticatedRequestsSession, + get_http_auth_requests_session, +) +from feast.permissions.client.kubernetes_auth_client_manager import ( + KubernetesAuthClientManager, +) +from feast.permissions.client.oidc_authentication_client_manager import ( + OidcAuthClientManager, +) + +MOCKED_TOKEN_VALUE: str = "dummy_token" + + +def _get_dummy_oidc_auth_type() -> OidcAuthConfig: + oidc_config = OidcAuthConfig( + auth_discovery_url="http://localhost:8080/realms/master/.well-known/openid-configuration", + type="oidc", + username="admin_test", + password="password_test", + client_id="dummy_client_id", + ) + return oidc_config + + +@patch.object(KubernetesAuthClientManager, "get_token", return_value=MOCKED_TOKEN_VALUE) +@patch.object(OidcAuthClientManager, "get_token", return_value=MOCKED_TOKEN_VALUE) +def test_http_auth_requests_session(mock_kubernetes_token, mock_oidc_token): + no_auth_config = NoAuthConfig() + assert isinstance(get_http_auth_requests_session(no_auth_config), Session) + + oidc_auth_config = _get_dummy_oidc_auth_type() + oidc_auth_requests_session = get_http_auth_requests_session(oidc_auth_config) + _assert_auth_requests_session(oidc_auth_requests_session, MOCKED_TOKEN_VALUE) + + kubernetes_auth_config = KubernetesAuthConfig(type="kubernetes") + kubernetes_auth_requests_session = get_http_auth_requests_session( + kubernetes_auth_config + ) + _assert_auth_requests_session(kubernetes_auth_requests_session, MOCKED_TOKEN_VALUE) + + +def _assert_auth_requests_session( + auth_req_session: AuthenticatedRequestsSession, expected_token: str +): + assert isinstance(auth_req_session, 
AuthenticatedRequestsSession) + assert "Authorization" in auth_req_session.headers, ( + "Authorization header is missing in object of class: " + "AuthenticatedRequestsSession " + ) + assert ( + auth_req_session.headers["Authorization"] == f"Bearer {expected_token}" + ), "Authorization token is incorrect" diff --git a/sdk/python/tests/unit/permissions/test_permission.py b/sdk/python/tests/unit/permissions/test_permission.py new file mode 100644 index 0000000000..606d750d81 --- /dev/null +++ b/sdk/python/tests/unit/permissions/test_permission.py @@ -0,0 +1,205 @@ +from unittest.mock import Mock + +import assertpy +import pytest + +from feast.batch_feature_view import BatchFeatureView +from feast.data_source import DataSource +from feast.entity import Entity +from feast.feast_object import ALL_RESOURCE_TYPES +from feast.feature_service import FeatureService +from feast.feature_view import FeatureView +from feast.on_demand_feature_view import OnDemandFeatureView +from feast.permissions.action import ALL_ACTIONS, AuthzedAction +from feast.permissions.permission import ( + Permission, +) +from feast.permissions.policy import AllowAll, Policy +from feast.saved_dataset import ValidationReference +from feast.stream_feature_view import StreamFeatureView + + +def test_defaults(): + p = Permission(name="test") + assertpy.assert_that(type(p.types)).is_equal_to(list) + assertpy.assert_that(p.types).is_equal_to(ALL_RESOURCE_TYPES) + assertpy.assert_that(p.name_pattern).is_none() + assertpy.assert_that(p.tags).is_none() + assertpy.assert_that(type(p.actions)).is_equal_to(list) + assertpy.assert_that(p.actions).is_equal_to(ALL_ACTIONS) + assertpy.assert_that(type(p.actions)).is_equal_to(list) + assertpy.assert_that(isinstance(p.policy, Policy)).is_true() + assertpy.assert_that(p.policy).is_equal_to(AllowAll) + + +@pytest.mark.parametrize( + "dict, result", + [ + ({"types": None}, True), + ({"types": []}, True), + ({"types": ALL_RESOURCE_TYPES}, True), + ({"types": [FeatureView, FeatureService]}, True), + ({"actions": None}, False), + ({"actions": []}, False), + ({"actions": ALL_ACTIONS}, True), + ({"actions": ALL_ACTIONS}, True), + ({"actions": [AuthzedAction.CREATE, AuthzedAction.DELETE]}, True), + ({"policy": None}, False), + ({"policy": []}, False), + ({"policy": Mock(spec=Policy)}, True), + ], +) +def test_validity(dict, result): + if not result: + with pytest.raises(ValueError): + Permission(name="test", **dict) + else: + Permission(name="test", **dict) + + +def test_normalized_args(): + p = Permission(name="test") + assertpy.assert_that(type(p.types)).is_equal_to(list) + assertpy.assert_that(p.types).is_equal_to(ALL_RESOURCE_TYPES) + + p = Permission(name="test", actions=AuthzedAction.CREATE) + assertpy.assert_that(type(p.actions)).is_equal_to(list) + assertpy.assert_that(p.actions).is_equal_to([AuthzedAction.CREATE]) + + +@pytest.mark.parametrize( + "resource, types, result", + [ + (None, ALL_RESOURCE_TYPES, False), + ("invalid string", ALL_RESOURCE_TYPES, False), + ("ALL", ALL_RESOURCE_TYPES, False), + ("ALL", ALL_RESOURCE_TYPES, False), + ( + Mock(spec=FeatureView), + [t for t in ALL_RESOURCE_TYPES if t not in [FeatureView]], + False, + ), + ( + Mock(spec=OnDemandFeatureView), + [t for t in ALL_RESOURCE_TYPES if t not in [OnDemandFeatureView]], + False, + ), # OnDemandFeatureView is a BaseFeatureView + ( + Mock(spec=BatchFeatureView), + FeatureView, + True, + ), # BatchFeatureView is a FeatureView + ( + Mock(spec=BatchFeatureView), + [t for t in ALL_RESOURCE_TYPES if t not in [FeatureView, 
BatchFeatureView]], + False, + ), + ( + Mock(spec=StreamFeatureView), + FeatureView, + True, + ), # StreamFeatureView is a FeatureView + ( + Mock(spec=StreamFeatureView), + [ + t + for t in ALL_RESOURCE_TYPES + if t not in [FeatureView, StreamFeatureView] + ], + False, + ), + ( + Mock(spec=Entity), + [t for t in ALL_RESOURCE_TYPES if t not in [Entity]], + False, + ), + ( + Mock(spec=FeatureService), + [t for t in ALL_RESOURCE_TYPES if t not in [FeatureService]], + False, + ), + ( + Mock(spec=DataSource), + [t for t in ALL_RESOURCE_TYPES if t not in [DataSource]], + False, + ), + ( + Mock(spec=ValidationReference), + [t for t in ALL_RESOURCE_TYPES if t not in [ValidationReference]], + False, + ), + ( + Mock(spec=Permission), + [t for t in ALL_RESOURCE_TYPES if t not in [Permission]], + False, + ), + ] + + [(Mock(spec=t), ALL_RESOURCE_TYPES, True) for t in ALL_RESOURCE_TYPES] + + [(Mock(spec=t), [t], True) for t in ALL_RESOURCE_TYPES], +) +def test_match_resource_with_subclasses(resource, types, result): + p = Permission(name="test", types=types) + assertpy.assert_that(p.match_resource(resource)).is_equal_to(result) + + +@pytest.mark.parametrize( + "pattern, name, match", + [ + ("test.*", "test", True), + ("test.*", "test1", True), + ("test.*", "wrongtest", False), + (".*test.*", "wrongtest", True), + ], +) +def test_resource_match_with_name_filter(pattern, name, match): + p = Permission(name="test", name_pattern=pattern) + for t in ALL_RESOURCE_TYPES: + resource = Mock(spec=t) + resource.name = name + assertpy.assert_that(p.match_resource(resource)).is_equal_to(match) + + +@pytest.mark.parametrize( + ("required_tags, tags, result"), + [ + ({"owner": "dev"}, {}, False), + ({"owner": "dev"}, {"owner": "master"}, False), + ({"owner": "dev"}, {"owner": "dev", "other": 1}, True), + ({"owner": "dev", "dep": 1}, {"owner": "dev", "other": 1}, False), + ({"owner": "dev", "dep": 1}, {"owner": "dev", "other": 1, "dep": 1}, True), + ], +) +def test_resource_match_with_tags(required_tags, tags, result): + # Missing tags + p = Permission(name="test", required_tags=required_tags) + for t in ALL_RESOURCE_TYPES: + resource = Mock(spec=t) + resource.name = "test" + resource.required_tags = tags + assertpy.assert_that(p.match_resource(resource)).is_equal_to(result) + + +@pytest.mark.parametrize( + ("permitted_actions, requested_actions, result"), + [(ALL_ACTIONS, [a], True) for a in AuthzedAction.__members__.values()] + + [ + ( + [AuthzedAction.CREATE, AuthzedAction.DELETE], + [AuthzedAction.CREATE, AuthzedAction.DELETE], + True, + ), + ([AuthzedAction.CREATE, AuthzedAction.DELETE], [AuthzedAction.CREATE], True), + ([AuthzedAction.CREATE, AuthzedAction.DELETE], [AuthzedAction.DELETE], True), + ([AuthzedAction.CREATE, AuthzedAction.DELETE], [AuthzedAction.UPDATE], False), + ( + [AuthzedAction.CREATE, AuthzedAction.DELETE], + [AuthzedAction.CREATE, AuthzedAction.DELETE, AuthzedAction.UPDATE], + False, + ), + ], +) +def test_match_actions(permitted_actions, requested_actions, result): + p = Permission(name="test", actions=permitted_actions) + assertpy.assert_that( + p.match_actions(requested_actions=requested_actions) + ).is_equal_to(result) diff --git a/sdk/python/tests/unit/permissions/test_policy.py b/sdk/python/tests/unit/permissions/test_policy.py new file mode 100644 index 0000000000..4e78282d4f --- /dev/null +++ b/sdk/python/tests/unit/permissions/test_policy.py @@ -0,0 +1,44 @@ +import assertpy +import pytest + +from feast.permissions.policy import AllowAll, RoleBasedPolicy +from feast.permissions.user 
import User + + +@pytest.mark.parametrize( + "username", + [("r"), ("w"), ("rw"), ("missing")], +) +def test_allow_all(users, username): + user = users.get(username, User(username, [])) + assertpy.assert_that(AllowAll.validate_user(user)).is_true() + + +@pytest.mark.parametrize( + "required_roles, username, result", + [ + (["reader"], "r", True), + (["writer"], "r", False), + (["reader", "writer"], "r", True), + (["writer", "updater"], "r", False), + (["reader"], "w", False), + (["writer"], "w", True), + (["reader", "writer"], "w", True), + (["reader", "updater"], "w", False), + (["reader"], "rw", True), + (["writer"], "rw", True), + (["reader", "writer"], "rw", True), + (["updater"], "rw", False), + ], +) +def test_role_based_policy(users, required_roles, username, result): + user = users.get(username) + policy = RoleBasedPolicy(roles=required_roles) + + validate_result, explain = policy.validate_user(user) + assertpy.assert_that(validate_result).is_equal_to(result) + + if result is True: + assertpy.assert_that(explain).is_equal_to("") + else: + assertpy.assert_that(len(explain)).is_greater_than(0) diff --git a/sdk/python/tests/unit/permissions/test_security_manager.py b/sdk/python/tests/unit/permissions/test_security_manager.py new file mode 100644 index 0000000000..192542da78 --- /dev/null +++ b/sdk/python/tests/unit/permissions/test_security_manager.py @@ -0,0 +1,83 @@ +import assertpy +import pytest + +from feast.permissions.action import READ, AuthzedAction +from feast.permissions.security_manager import assert_permissions, permitted_resources + + +@pytest.mark.parametrize( + "username, requested_actions, allowed, allowed_single, raise_error_in_assert, raise_error_in_permit", + [ + (None, [], False, [False, False], [True, True], False), + ("r", [AuthzedAction.DESCRIBE], True, [True, True], [False, False], False), + ("r", [AuthzedAction.UPDATE], False, [False, False], [True, True], False), + ("w", [AuthzedAction.DESCRIBE], False, [False, False], [True, True], False), + ("w", [AuthzedAction.UPDATE], False, [True, True], [False, False], False), + ("rw", [AuthzedAction.DESCRIBE], False, [True, True], [False, False], False), + ("rw", [AuthzedAction.UPDATE], False, [True, True], [False, False], False), + ( + "rw", + [AuthzedAction.DESCRIBE, AuthzedAction.UPDATE], + False, + [False, False], + [True, True], + True, + ), + ( + "admin", + [AuthzedAction.DESCRIBE, AuthzedAction.UPDATE], + False, + [False, True], + [True, False], + True, + ), + ( + "admin", + READ + [AuthzedAction.UPDATE], + False, + [False, False], + [True, True], + True, + ), + ], +) +def test_access_SecuredFeatureView( + security_manager, + feature_views, + users, + username, + requested_actions, + allowed, + allowed_single, + raise_error_in_assert, + raise_error_in_permit, +): + sm = security_manager + resources = feature_views + + user = users.get(username) + sm.set_current_user(user) + + result = [] + if raise_error_in_permit: + with pytest.raises(PermissionError): + result = permitted_resources(resources=resources, actions=requested_actions) + else: + result = permitted_resources(resources=resources, actions=requested_actions) + + if allowed: + assertpy.assert_that(result).is_equal_to(resources) + elif not raise_error_in_permit: + filtered = [r for i, r in enumerate(resources) if allowed_single[i]] + assertpy.assert_that(result).is_equal_to(filtered) + + for i, r in enumerate(resources): + if allowed_single[i]: + result = assert_permissions(resource=r, actions=requested_actions) + 
assertpy.assert_that(result).is_equal_to(r) + elif raise_error_in_assert[i]: + with pytest.raises(PermissionError): + assert_permissions(resource=r, actions=requested_actions) + else: + result = assert_permissions(resource=r, actions=requested_actions) + assertpy.assert_that(result).is_none() diff --git a/sdk/python/tests/unit/permissions/test_user.py b/sdk/python/tests/unit/permissions/test_user.py new file mode 100644 index 0000000000..cce318cba7 --- /dev/null +++ b/sdk/python/tests/unit/permissions/test_user.py @@ -0,0 +1,34 @@ +import assertpy +import pytest + +from feast.permissions.user import User + + +@pytest.fixture(scope="module") +def users(): + users = [] + users.append(User("a", ["a1", "a2"])) + users.append(User("b", ["b1", "b2"])) + return dict([(u.username, u) for u in users]) + + +@pytest.mark.parametrize( + "username, roles, result", + [ + ("c", [], False), + ("a", ["b1"], False), + ("a", ["a1", "b1"], True), + ("a", ["a1"], True), + ("a", ["a1", "a2"], True), + ("a", ["a1", "a2", "a3"], True), + ("b", ["a1", "a3"], False), + ("b", ["a1", "b1"], True), + ("b", ["b1", "b2"], True), + ("b", ["b1", "b2", "b3"], True), + ], +) +def test_user_has_matching_role(users, username, roles, result): + user = users.get(username, User(username, [])) + assertpy.assert_that(user.has_matching_role(requested_roles=roles)).is_equal_to( + result + ) diff --git a/sdk/python/tests/unit/test_offline_server.py b/sdk/python/tests/unit/test_offline_server.py index 5991e7450d..237e2ecad4 100644 --- a/sdk/python/tests/unit/test_offline_server.py +++ b/sdk/python/tests/unit/test_offline_server.py @@ -14,7 +14,7 @@ RemoteOfflineStore, RemoteOfflineStoreConfig, ) -from feast.offline_server import OfflineServer +from feast.offline_server import OfflineServer, _init_auth_manager from feast.repo_config import RepoConfig from tests.utils.cli_repo_creator import CliRunner @@ -26,6 +26,7 @@ def empty_offline_server(environment): store = environment.feature_store location = "grpc+tcp://localhost:0" + _init_auth_manager(store=store) return OfflineServer(store=store, location=location) @@ -102,6 +103,8 @@ def test_remote_offline_store_apis(): with tempfile.TemporaryDirectory() as temp_dir: store = default_store(str(temp_dir)) location = "grpc+tcp://localhost:0" + + _init_auth_manager(store=store) server = OfflineServer(store=store, location=location) assertpy.assert_that(server).is_not_none diff --git a/sdk/python/tests/utils/auth_permissions_util.py b/sdk/python/tests/utils/auth_permissions_util.py new file mode 100644 index 0000000000..3b5e589812 --- /dev/null +++ b/sdk/python/tests/utils/auth_permissions_util.py @@ -0,0 +1,245 @@ +import os +import subprocess + +import yaml +from keycloak import KeycloakAdmin + +from feast import ( + FeatureStore, + RepoConfig, +) +from feast.infra.registry.remote import RemoteRegistryConfig +from feast.permissions.permission import Permission +from feast.wait import wait_retry_backoff +from tests.utils.cli_repo_creator import CliRunner +from tests.utils.http_server import check_port_open + +PROJECT_NAME = "feast_test_project" + + +def include_auth_config(file_path, auth_config: str): + with open(file_path, "r") as file: + existing_content = yaml.safe_load(file) + new_section = yaml.safe_load(auth_config) + if isinstance(existing_content, dict) and isinstance(new_section, dict): + existing_content.update(new_section) + else: + raise ValueError("Both existing content and new section must be dictionaries.") + with open(file_path, "w") as file: + 
yaml.safe_dump(existing_content, file, default_flow_style=False)
+    print(f"Updated auth section at {file_path}")
+
+
+def default_store(
+    temp_dir,
+    auth_config: str,
+    permissions: list[Permission],
+):
+    runner = CliRunner()
+    result = runner.run(["init", PROJECT_NAME], cwd=temp_dir)
+    repo_path = os.path.join(temp_dir, PROJECT_NAME, "feature_repo")
+    assert result.returncode == 0
+
+    include_auth_config(
+        file_path=f"{repo_path}/feature_store.yaml", auth_config=auth_config
+    )
+
+    result = runner.run(["--chdir", repo_path, "apply"], cwd=temp_dir)
+    assert result.returncode == 0
+
+    fs = FeatureStore(repo_path=repo_path)
+
+    fs.apply(permissions)
+
+    return fs
+
+
+def start_feature_server(repo_path: str, server_port: int, metrics: bool = False):
+    host = "0.0.0.0"
+    cmd = [
+        "feast",
+        "-c" + repo_path,
+        "serve",
+        "--host",
+        host,
+        "--port",
+        str(server_port),
+    ]
+    if metrics:
+        cmd.append("--metrics")
+
+    feast_server_process = subprocess.Popen(
+        cmd, stdout=subprocess.PIPE, stderr=subprocess.PIPE
+    )
+    _time_out_sec: int = 60
+    # Wait for server to start
+    wait_retry_backoff(
+        lambda: (None, check_port_open(host, server_port)),
+        timeout_secs=_time_out_sec,
+        timeout_msg=f"Unable to start the feast server in {_time_out_sec} seconds for remote online store type, port={server_port}",
+    )
+
+    # Check if metrics are enabled and Prometheus server is running
+    if metrics:
+        wait_retry_backoff(
+            lambda: (None, check_port_open("localhost", 8000)),
+            timeout_secs=_time_out_sec,
+            timeout_msg="Unable to start the Prometheus server in 60 seconds.",
+        )
+    else:
+        assert not check_port_open(
+            "localhost", 8000
+        ), "Prometheus server is running when it should be disabled."
+
+    yield f"http://localhost:{server_port}"
+
+    if feast_server_process is not None:
+        feast_server_process.kill()
+
+    # wait server to free the port
+    wait_retry_backoff(
+        lambda: (
+            None,
+            not check_port_open("localhost", server_port),
+        ),
+        timeout_msg=f"Unable to stop the feast server in {_time_out_sec} seconds for remote online store type, port={server_port}",
+        timeout_secs=_time_out_sec,
+    )
+
+
+def get_remote_registry_store(server_port, feature_store):
+    registry_config = RemoteRegistryConfig(
+        registry_type="remote", path=f"localhost:{server_port}"
+    )
+
+    store = FeatureStore(
+        config=RepoConfig(
+            project=PROJECT_NAME,
+            auth=feature_store.config.auth,
+            registry=registry_config,
+            provider="local",
+            entity_key_serialization_version=2,
+        )
+    )
+    return store
+
+
+def setup_permissions_on_keycloak(keycloak_admin: KeycloakAdmin):
+    new_client_id = "feast-integration-client"
+    new_client_secret = "feast-integration-client-secret"
+    # Create a new client
+    client_representation = {
+        "clientId": new_client_id,
+        "secret": new_client_secret,
+        "enabled": True,
+        "directAccessGrantsEnabled": True,
+        "publicClient": False,
+        "redirectUris": ["*"],
+        "serviceAccountsEnabled": True,
+        "standardFlowEnabled": True,
+    }
+    keycloak_admin.create_client(client_representation)
+
+    # Get the client ID
+    client_id = keycloak_admin.get_client_id(new_client_id)
+
+    # Role representation
+    reader_role_rep = {
+        "name": "reader",
+        "description": "feast reader client role",
+        "composite": False,
+        "clientRole": True,
+        "containerId": client_id,
+    }
+    keycloak_admin.create_client_role(client_id, reader_role_rep, True)
+    reader_role_id = keycloak_admin.get_client_role(
+        client_id=client_id, role_name="reader"
+    )
+
+    # Role representation
+    writer_role_rep = {
+        "name": "writer",
+        "description": "feast writer client 
role", + "composite": False, + "clientRole": True, + "containerId": client_id, + } + keycloak_admin.create_client_role(client_id, writer_role_rep, True) + writer_role_id = keycloak_admin.get_client_role( + client_id=client_id, role_name="writer" + ) + + # Mapper representation + mapper_representation = { + "name": "client-roles-mapper", + "protocol": "openid-connect", + "protocolMapper": "oidc-usermodel-client-role-mapper", + "consentRequired": False, + "config": { + "multivalued": "true", + "userinfo.token.claim": "true", + "id.token.claim": "true", + "access.token.claim": "true", + "claim.name": "roles", + "jsonType.label": "String", + "client.id": client_id, + }, + } + + # Add predefined client roles mapper to the client + keycloak_admin.add_mapper_to_client(client_id, mapper_representation) + + reader_writer_user = { + "username": "reader_writer", + "enabled": True, + "firstName": "reader_writer fn", + "lastName": "reader_writer ln", + "email": "reader_writer@email.com", + "emailVerified": True, + "credentials": [{"value": "password", "type": "password", "temporary": False}], + } + reader_writer_user_id = keycloak_admin.create_user(reader_writer_user) + keycloak_admin.assign_client_role( + user_id=reader_writer_user_id, + client_id=client_id, + roles=[reader_role_id, writer_role_id], + ) + + reader_user = { + "username": "reader", + "enabled": True, + "firstName": "reader fn", + "lastName": "reader ln", + "email": "reader@email.com", + "emailVerified": True, + "credentials": [{"value": "password", "type": "password", "temporary": False}], + } + reader_user_id = keycloak_admin.create_user(reader_user) + keycloak_admin.assign_client_role( + user_id=reader_user_id, client_id=client_id, roles=[reader_role_id] + ) + + writer_user = { + "username": "writer", + "enabled": True, + "firstName": "writer fn", + "lastName": "writer ln", + "email": "writer@email.com", + "emailVerified": True, + "credentials": [{"value": "password", "type": "password", "temporary": False}], + } + writer_user_id = keycloak_admin.create_user(writer_user) + keycloak_admin.assign_client_role( + user_id=writer_user_id, client_id=client_id, roles=[writer_role_id] + ) + + no_roles_user = { + "username": "no_roles_user", + "enabled": True, + "firstName": "no_roles_user fn", + "lastName": "no_roles_user ln", + "email": "no_roles_user@email.com", + "emailVerified": True, + "credentials": [{"value": "password", "type": "password", "temporary": False}], + } + keycloak_admin.create_user(no_roles_user) diff --git a/setup.py b/setup.py index 6fb5bfee61..d53aee1002 100644 --- a/setup.py +++ b/setup.py @@ -61,6 +61,9 @@ "dask[dataframe]>=2024.2.1", "prometheus_client", "psutil", + "bigtree>=0.19.2", + "pyjwt", + "kubernetes<=20.13.0", ] GCP_REQUIRED = [ @@ -183,6 +186,7 @@ "pytest-env", "Sphinx>4.0.0,<7", "testcontainers==4.4.0", + "python-keycloak==4.2.2", "pre-commit<3.3.2", "assertpy==1.1", "pip-tools",