Connect to Redshift (in private Subnet) using Lambda


I have a Redshift cluster in a private subnet and I am using the psycopg2 library to connect to it from Lambda. I have deployed the Lambda inside the VPC, with the appropriate subnets and security group.

When I use the following code to connect,

import boto3
import psycopg2

try:
    # Fetch temporary cluster credentials from the Redshift API
    client = boto3.client('redshift')
    creds = client.get_cluster_credentials(
        DbUser=REDSHIFT_USER,
        DbName=REDSHIFT_DATABASE,
        ClusterIdentifier=REDSHIFT_CLUSTER,
        DurationSeconds=3600)
except Exception as ERROR:
    print("Credentials Issue: " + str(ERROR))
    raise

try:
    # Connect to the cluster using the temporary credentials
    conn = psycopg2.connect(
        dbname=REDSHIFT_DATABASE,
        user=creds['DbUser'],
        password=creds['DbPassword'],
        port=REDSHIFT_PORT,
        host=REDSHIFT_ENDPOINT)
except Exception as ERROR:
    print("Connection Issue: ")
    raise

I get the following error:

{
  "errorMessage": "FATAL:  no pg_hba.conf entry for host \"::ffff:10.209.4.117\", user \"IAM:awsuser\", database \"dev\", SSL off\n",
  "errorType": "OperationalError",

I have solved this error in SQL clients by adding the following connection properties: AuthMech=REQUIRE&ssl=TRUE

However, I am not sure how to make the Lambda code connect to Redshift over SSL.
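For reference, this is roughly what I assume the equivalent would look like in psycopg2, passing the libpq sslmode parameter (untested from the Lambda so far):

conn = psycopg2.connect(
    dbname=REDSHIFT_DATABASE,
    user=creds['DbUser'],
    password=creds['DbPassword'],
    port=REDSHIFT_PORT,
    host=REDSHIFT_ENDPOINT,
    sslmode='require')  # assumed equivalent of AuthMech=REQUIRE&ssl=TRUE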

AWS
EXPERT
asked 4 years ago · 1650 views
1 Answer
Accepted Answer

Check whether the Lambda is running in the default VPC; it should be running in the same VPC as the Redshift cluster. Please read more here: you will find a tutorial with RDS. Use pg instead of pymysql to connect to Redshift; psycopg2 is also fine.
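For example, a minimal boto3 sketch to confirm and, if needed, update the function's VPC settings (the function name, subnet IDs, and security group ID below are placeholders, not values from the question):

import boto3

lambda_client = boto3.client('lambda')

# Inspect the VPC configuration the function is currently attached to
config = lambda_client.get_function_configuration(FunctionName='my-redshift-lambda')
print(config.get('VpcConfig'))

# Attach the function to the same VPC subnets as the Redshift cluster, using a
# security group that the cluster's security group allows on the Redshift port
lambda_client.update_function_configuration(
    FunctionName='my-redshift-lambda',
    VpcConfig={
        'SubnetIds': ['subnet-aaaa1111', 'subnet-bbbb2222'],
        'SecurityGroupIds': ['sg-cccc3333'],
    })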

AWS
Kunal_G
answered 4 years ago
