
OIDC with external role mapper does not work after upgrading to 0.25.1 #1946

Open
BernhardBerbuir opened this issue Sep 16, 2024 · 2 comments


@BernhardBerbuir

I'm using AKHQ with an external OIDC mapper that provides the groups of a user. After upgrading from 0.25.0 to 0.25.1, I get the following error from the web interface / server log / API endpoint:

{
  "message": "Internal Server Error: Cannot invoke \"String.equals(Object)\" because the return value of \"org.akhq.configs.security.Group.getRole()\" is null",
  "_links": {
    "self": {
      "href": "/api/kafka-dev/topic",
      "templated": false
    }
  },
  "_embedded": {
    "stacktrace": [
      {
        "message": "java.lang.NullPointerException: Cannot invoke \"String.equals(Object)\" because the return value of \"org.akhq.configs.security.Group.getRole()\" is null\n\tat org.akhq.controllers.AbstractController.lambda$checkIfClusterAndResourceAllowed$9(AbstractController.java:152)\n\tat java.base/java.util.stream.ReferencePipeline$2$1.accept(Unknown Source)\n\tat java.base/java.util.Spliterators$IteratorSpliterator.tryAdvance(Unknown Source)\n\tat java.base/java.util.stream.ReferencePipeline.forEachWithCancel(Unknown Source)\n\tat java.base/java.util.stream.AbstractPipeline.copyIntoWithCancel(Unknown Source)\n\tat java.base/java.util.stream.AbstractPipeline.copyInto(Unknown Source)\n\tat java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(Unknown Source)\n\tat java.base/java.util.stream.MatchOps$MatchOp.evaluateSequential(Unknown Source)\n\tat java.base/java.util.stream.MatchOps$MatchOp.evaluateSequential(Unknown Source)\n\tat java.base/java.util.stream.AbstractPipeline.evaluate(Unknown Source)\n\tat java.base/java.util.stream.ReferencePipeline.anyMatch(Unknown Source)\n\tat org.akhq.controllers.AbstractController.lambda$checkIfClusterAndResourceAllowed$12(AbstractController.java:154)\n\tat java.base/java.util.stream.ReferencePipeline$2$1.accept(Unknown Source)\n\tat java.base/java.util.ArrayList$ArrayListSpliterator.tryAdvance(Unknown Source)\n\tat java.base/java.util.stream.ReferencePipeline.forEachWithCancel(Unknown Source)\n\tat java.base/java.util.stream.AbstractPipeline.copyIntoWithCancel(Unknown Source)\n\tat java.base/java.util.stream.AbstractPipeline.copyInto(Unknown Source)\n\tat java.base/java.util.stream.AbstractPipeline.wrapAndCopyInto(Unknown Source)\n\tat java.base/java.util.stream.MatchOps$MatchOp.evaluateSequential(Unknown Source)\n\tat java.base/java.util.stream.MatchOps$MatchOp.evaluateSequential(Unknown Source)\n\tat java.base/java.util.stream.AbstractPipeline.evaluate(Unknown Source)\n\tat 
java.base/java.util.stream.ReferencePipeline.anyMatch(Unknown Source)\n\tat org.akhq.controllers.AbstractController.checkIfClusterAndResourceAllowed(AbstractController.java:157)\n\tat org.akhq.controllers.AbstractController.checkIfClusterAllowed(AbstractController.java:130)\n\tat org.akhq.controllers.TopicController.list(TopicController.java:101)\n\tat org.akhq.controllers.$TopicController$Definition$Exec.dispatch(Unknown Source)\n\tat io.micronaut.context.AbstractExecutableMethodsDefinition$DispatchedExecutableMethod.invokeUnsafe(AbstractExecutableMethodsDefinition.java:461)\n\tat io.micronaut.context.DefaultBeanContext$BeanContextUnsafeExecutionHandle.invokeUnsafe(DefaultBeanContext.java:4276)\n\tat io.micronaut.web.router.AbstractRouteMatch.execute(AbstractRouteMatch.java:271)\n\tat io.micronaut.http.server.RouteExecutor.executeRouteAndConvertBody(RouteExecutor.java:488)\n\tat io.micronaut.http.server.RouteExecutor.lambda$callRoute$6(RouteExecutor.java:465)\n\tat io.micronaut.core.execution.ExecutionFlow.lambda$async$1(ExecutionFlow.java:87)\n\tat io.micrometer.core.instrument.composite.CompositeTimer.record(CompositeTimer.java:141)\n\tat io.micrometer.core.instrument.Timer.lambda$wrap$0(Timer.java:193)\n\tat io.micronaut.core.propagation.PropagatedContext.lambda$wrap$3(PropagatedContext.java:211)\n\tat io.micrometer.core.instrument.composite.CompositeTimer.record(CompositeTimer.java:141)\n\tat io.micrometer.core.instrument.Timer.lambda$wrap$0(Timer.java:193)\n\tat io.micronaut.core.propagation.PropagatedContext.lambda$wrap$3(PropagatedContext.java:211)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(Unknown Source)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(Unknown Source)\n\tat java.base/java.lang.Thread.run(Unknown Source)\n"
      }
    ]
  }
} 
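The stack trace suggests the permission check compares each group's role name with String.equals, which throws as soon as a group has no role assigned. A minimal, hypothetical sketch of that failure mode (class and method names are illustrative stand-ins, not AKHQ's actual code):

```java
import java.util.List;
import java.util.Objects;

public class RoleCheckSketch {
    // Illustrative stand-in for org.akhq.configs.security.Group
    record Group(String role) {
        String getRole() { return role; }
    }

    // Mirrors the failing pattern: g.getRole().equals(...) throws a
    // NullPointerException when one group has a null role.
    static boolean hasRoleUnsafe(List<Group> groups, String wanted) {
        return groups.stream().anyMatch(g -> g.getRole().equals(wanted));
    }

    // Null-safe variant: Objects.equals tolerates a null role.
    static boolean hasRoleSafe(List<Group> groups, String wanted) {
        return groups.stream().anyMatch(g -> Objects.equals(g.getRole(), wanted));
    }

    public static void main(String[] args) {
        List<Group> groups = List.of(new Group(null), new Group("ReadOnly"));
        System.out.println(hasRoleSafe(groups, "ReadOnly")); // prints true
        try {
            hasRoleUnsafe(groups, "ReadOnly");
        } catch (NullPointerException e) {
            // same exception type as in the report above
            System.out.println("NullPointerException");
        }
    }
}
```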

I'm using the following configuration (redacted):

akhq:
  clients-defaults:
    consumer:
      properties:
        default.api.timeout.ms: 60000
  connections:
    kafka-dev:
      ...
  pagination:
    page-size: 10
    threads: 2
  security:
    default-group: no-roles
    groups:
      no-roles:
        roles: []
    oidc:
      enabled: true
      providers:
        oidc:
          # Default group for every user, including unauthenticated users
          default-group: no-roles # group without any roles
          label: "Login with SSO"
          groups-field: groups
          username-field: email
    # see https://akhq.io/docs/configuration/authentifications/external.html
    rest:
      enabled: true
      url: http://localhost:8090/get-roles-and-attributes
    roles:
      # manage **all** topics and consumer groups
      Administrator:
        - actions: [ "READ", "UPDATE_OFFSET", "DELETE_OFFSET" ]
          resources: [ "CONSUMER_GROUP" ]
        - actions: [ "DELETE", "READ", "READ_CONFIG" ]
          resources: [ "TOPIC" ]
        - actions: [ "READ", "DELETE" ]
          resources: [ "TOPIC_DATA" ]
      # manage topics and consumer groups of a specific principal
      principal_Administrator:
        - actions: [ "READ", "UPDATE_OFFSET", "DELETE_OFFSET" ]
          resources: [ "CONSUMER_GROUP" ]
        - actions: [ "DELETE", "READ", "READ_CONFIG"  ]
          resources: [ "TOPIC" ]
        - actions: [ "READ", "DELETE" ]
          resources: [ "TOPIC_DATA" ]
      # read messages from explicitly allowed topics
      principal_ReadOnly:
        - actions: [ "READ" ]
          resources: [ "CONSUMER_GROUP" ]
        - actions: [ "READ", "READ_CONFIG" ]
          resources: [ "TOPIC" ]
        - actions: [ "READ" ]
          resources: [ "TOPIC_DATA" ]
      # read messages from **all** topics
      ReadOnly:
        - actions: [ "READ" ]
          resources: [ "CONSUMER_GROUP" ]
        - actions: [ "READ", "READ_CONFIG" ]
          resources: [ "TOPIC" ]
        - actions: [ "READ" ]
          resources: [ "TOPIC_DATA" ]
      # general read permissions on metadata for every user
      # (akhq-oidc-mapper always adds this role to a user)
      AuthenticatedUser:
        - actions: [ "READ" ]
          resources: [ "SCHEMA", "NODE", "ACL" ]
        - actions: [ "READ_CONFIG" ]
          resources: [ "NODE" ]
  topic-data:
    date-time-format: ISO
    size: 10
    sort: NEWEST
logger:
  levels:
    # enable for debugging problems
    # io.micronaut.security: TRACE
    # org.akhq.configs: INFO
micronaut:
  security:
    # URL of the deployed AKHQ application
    # REMARK: this URL must be registered as "Valid redirect URI" at the oidc client at Keycloak
    callback-uri: "http://localhost:5000/oauth/callback/oidc"
    enabled: true
    oauth2:
      clients:
        oidc:
          client-id: kafkanextplatform
          client-secret: "${OPENID_CLIENT_SECRET}"
          openid:
            ...
      enabled: true
    # see https://guides.micronaut.io/latest/micronaut-security-jwt-gradle-groovy.html#configuration
    token:
      jwt:
        signatures:
          secret:
            generator:
              # the secret requires an undocumented minimum length
              # => use 32 characters or more
              secret: "${JWT_SECRET}"
  server:
    # use a different port in order to not interfere with Keycloak
    port: 5000
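
A side note on the JWT secret comment in the configuration above: if I understand correctly, the default HS256 signature needs at least 256 bits (32 bytes) of key material, which matches the 32-character recommendation. A sketch for generating such a secret (any cryptographically secure random source works just as well):

```java
import java.security.SecureRandom;
import java.util.Base64;

public class JwtSecretGen {
    public static void main(String[] args) {
        // 32 random bytes = 256 bits, the assumed minimum for HS256
        byte[] key = new byte[32];
        new SecureRandom().nextBytes(key);
        // Base64 keeps the value printable for the JWT_SECRET env variable
        String secret = Base64.getEncoder().withoutPadding().encodeToString(key);
        System.out.println(secret); // 43 characters, above the minimum
    }
}
```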

The login is working (/api/me):

{
  "logged": true,
  "username": "[email protected]",
  "roles": [
    {
      "resources": [
        "CONSUMER_GROUP"
      ],
      "actions": [
        "READ",
        "UPDATE_OFFSET",
        "DELETE_OFFSET"
      ],
      "patterns": [
        ".*"
      ],
      "clusters": [
        ".*"
      ]
    },
    ...
  ]
}

I cannot view any information about the Kafka cluster.

@AlexisSouquiere (Collaborator) commented Sep 16, 2024

I think the only thing that changed between the two versions is the default group management.
Can you try removing

    groups:
      no-roles:
        roles: []

And keep only:

akhq:
  security:
    default-group: no-roles

I don't know exactly how AKHQ handles a group with an empty roles array, but if you set default-group: no-roles without defining the no-roles group, it should work.

@BernhardBerbuir (Author)

@AlexisSouquiere: Thank you for your quick reply. I removed the mentioned part of the configuration and now AKHQ is working again.
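
For reference, a sketch of the resulting security section after the fix (the elided parts are unchanged from my configuration above; the key point is that no-roles is referenced but never defined as a group):

```yaml
akhq:
  security:
    # no `groups:` entry for no-roles: AKHQ treats the
    # undefined default group as a group without any roles
    default-group: no-roles
    oidc:
      enabled: true
      providers:
        oidc:
          default-group: no-roles
          label: "Login with SSO"
          groups-field: groups
          username-field: email
```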
