I lost admin access to my EKS cluster


We previously had a pod scheduling issue with this cluster, and we gave up on solving it, so we removed the worker nodes and kept the control plane for troubleshooting.

While trying to add worker nodes back, I followed the AWS documentation, but I made a mistake applying the following ConfigMap:

apiVersion: v1
kind: ConfigMap
metadata:
  name: aws-auth
  namespace: kube-system
data:
  mapRoles: |
    - rolearn: 'arn:aws:iam::account-number:role/testyy-NodeInstanceRole-1FQVVVZPS0TDP'
      username: system:node:{{EC2PrivateDNSName}}
      groups:
        - system:bootstrappers
        - system:nodes

That caused me to lose access to the cluster; applying it seems to have replaced the existing aws-auth ConfigMap. I'm not sure why that happened. The cluster is on EKS 1.21, so maybe it's the API version?

We cannot delete the cluster now. Is there a way to regain access to it? I can provide the cluster ARN if anyone can help us regain access. Thanks!

asked a year ago · 1,269 views
1 Answer


I see that you have misconfigured your aws-auth ConfigMap and lost access to your cluster. You can regain access by editing the aws-auth ConfigMap while authenticated as the IAM user or role that originally created the EKS cluster.

The cluster creator's IAM user/role always retains `system:masters` access to the cluster, even when the aws-auth ConfigMap is misconfigured, because that permission is granted outside the ConfigMap.

You can follow the steps provided in this document to access your EKS cluster as the cluster creator IAM role/user and edit the aws-auth ConfigMap as needed. Also note that `kubectl apply` replaces the entire `data` section of the ConfigMap, which is why your previous entries were lost: the ConfigMap you apply must contain every role and user mapping you want to keep, not just the new one.
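As a sketch of what the repaired ConfigMap might look like once you are back in as the cluster creator: the node role mapping from your post, plus a `mapUsers` entry so a day-to-day admin IAM identity keeps access even if the creator credentials are unavailable later. The `admin-user` ARN and username below are placeholders, not taken from your account; substitute your own, and keep any other existing mappings in the same file:

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: aws-auth
  namespace: kube-system
data:
  mapRoles: |
    # Node instance role from the question, so worker nodes can join
    - rolearn: 'arn:aws:iam::account-number:role/testyy-NodeInstanceRole-1FQVVVZPS0TDP'
      username: system:node:{{EC2PrivateDNSName}}
      groups:
        - system:bootstrappers
        - system:nodes
  mapUsers: |
    # Placeholder admin mapping (hypothetical ARN) granting cluster-admin
    - userarn: 'arn:aws:iam::account-number:user/admin-user'
      username: admin-user
      groups:
        - system:masters
```

Because an apply replaces the whole `data` section, editing in place with `kubectl -n kube-system edit configmap aws-auth` is often safer than re-applying a file that may be missing entries.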

If you are still facing issues after following the mentioned steps, please feel free to open a support case and we'll be glad to help you!

AWS
answered a year ago
