Connect to Redshift (in private Subnet) using Lambda


I have a Redshift cluster in a private subnet, and I am using the psycopg2 library to connect to it from Lambda. I have deployed the Lambda inside the VPC, with the subnets and security group configured appropriately.

When I use the following code to connect,

import boto3
import psycopg2

try:
    # Fetch temporary database credentials from the Redshift API
    client = boto3.client('redshift')
    creds = client.get_cluster_credentials(
        DbUser=REDSHIFT_USER,
        DbName=REDSHIFT_DATABASE,
        ClusterIdentifier=REDSHIFT_CLUSTER,
        DurationSeconds=3600)
except Exception as ERROR:
    print("Credentials Issue: " + str(ERROR))

try:
    conn = psycopg2.connect(
        dbname=REDSHIFT_DATABASE,
        user=creds['DbUser'],
        password=creds['DbPassword'],
        port=REDSHIFT_PORT,
        host=REDSHIFT_ENDPOINT)
except Exception as ERROR:
    print("Connection Issue: ")
    raise

I get the following error:

{
  "errorMessage": "FATAL:  no pg_hba.conf entry for host \"::ffff:10.209.4.117\", user \"IAM:awsuser\", database \"dev\", SSL off\n",
  "errorType": "OperationalError",

I have solved this error in SQL clients by adding the following connection properties: AuthMech=REQUIRE&ssl=TRUE

However, I am not sure how I can make the Lambda code connect to Redshift.

AWS
EXPERT
Asked 4 years ago · 1650 views
1 Answer
Accepted Answer

Check whether the Lambda is running in the default VPC; it should be running in the same VPC as the Redshift cluster. Please read more here: you will find a tutorial with RDS. Use pg instead of pymysql to connect to Redshift; psycopg2 is also fine.
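For the "SSL off" part of the error specifically, here is a minimal sketch of what the Lambda handler could look like, assuming placeholder cluster details and that the Lambda execution role is allowed to call redshift:GetClusterCredentials. The key change is passing sslmode='require' to psycopg2.connect, which is the psycopg2 counterpart of the ssl=TRUE / AuthMech=REQUIRE properties used in the SQL client:

import boto3
import psycopg2

# Placeholder values -- replace with your own cluster details.
REDSHIFT_CLUSTER = "my-redshift-cluster"
REDSHIFT_ENDPOINT = "my-redshift-cluster.abc123xyz.us-east-1.redshift.amazonaws.com"
REDSHIFT_DATABASE = "dev"
REDSHIFT_USER = "awsuser"
REDSHIFT_PORT = 5439

def get_connection():
    # Temporary credentials from the Redshift API (requires the
    # redshift:GetClusterCredentials permission on the Lambda role).
    client = boto3.client('redshift')
    creds = client.get_cluster_credentials(
        DbUser=REDSHIFT_USER,
        DbName=REDSHIFT_DATABASE,
        ClusterIdentifier=REDSHIFT_CLUSTER,
        DurationSeconds=3600)

    # sslmode='require' forces an SSL connection, so the cluster no longer
    # rejects the session with "SSL off" when require_ssl is enabled.
    return psycopg2.connect(
        dbname=REDSHIFT_DATABASE,
        user=creds['DbUser'],
        password=creds['DbPassword'],
        host=REDSHIFT_ENDPOINT,
        port=REDSHIFT_PORT,
        sslmode='require')

def lambda_handler(event, context):
    conn = get_connection()
    try:
        with conn.cursor() as cur:
            cur.execute("SELECT current_user")
            print(cur.fetchone())
    finally:
        conn.close()

If the cluster's parameter group has require_ssl set to true, plain (non-SSL) connections are rejected, which would explain the "SSL off" in the error message; sslmode='verify-full' together with the Redshift CA bundle is a stricter option if certificate verification is also wanted.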

AWS
Kunal_G
Answered 4 years ago
