Questions tagged with Database

Currently, the RDS "Create read replica" screen doesn't allow choosing a custom parameter group (or option group) at creation time. As a result, a new read replica with a smaller instance class ends up stuck in the "incompatible-parameters" state because of the high buffer-related parameter values in the original primary instance's parameter group. My workaround is to create the new replica with the primary's original instance class, change the parameter group, and then resize the replica down to the smaller instance class (a rough sketch of this follows the entry). In addition, replication errors can occur while the primary's parameter group is still applied to the read replica, because MEMORY-engine tables end up with out-of-sync rows between the instances; those tables were meant to be excluded from replication via the read replica's custom parameter group.
1 answer · 0 votes · 16 views
asked 22 days ago
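A rough boto3 sketch of the workaround described above (all identifiers, instance classes, and the parameter group name are hypothetical placeholders; the exact sequence is an assumption based on the question, not a documented procedure):

    # Sketch: create the replica at the primary's instance class, attach a
    # replica-specific parameter group, then downsize. All identifiers are
    # hypothetical placeholders.
    import boto3

    rds = boto3.client("rds", region_name="us-east-1")

    # 1. Create the read replica with the same instance class as the primary,
    #    since the console doesn't let us pick a parameter group at creation.
    rds.create_db_instance_read_replica(
        DBInstanceIdentifier="app-db-replica",
        SourceDBInstanceIdentifier="app-db-primary",
        DBInstanceClass="db.r5.xlarge",   # same class as the primary
    )
    rds.get_waiter("db_instance_available").wait(DBInstanceIdentifier="app-db-replica")

    # 2. Attach the replica's custom parameter group (smaller buffer values,
    #    replication filters for the MEMORY tables, etc.).
    rds.modify_db_instance(
        DBInstanceIdentifier="app-db-replica",
        DBParameterGroupName="replica-small-buffers",
        ApplyImmediately=True,
    )
    # Static parameters only take effect after a reboot.
    rds.reboot_db_instance(DBInstanceIdentifier="app-db-replica")
    rds.get_waiter("db_instance_available").wait(DBInstanceIdentifier="app-db-replica")

    # 3. Now resize the replica down to the smaller instance class.
    rds.modify_db_instance(
        DBInstanceIdentifier="app-db-replica",
        DBInstanceClass="db.t3.medium",
        ApplyImmediately=True,
    )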
Error while fetching data from DynamoDB via Athena. Error: GENERIC_USER_ERROR: Encountered an exception[java.lang.ClassCastException] from your LambdaFunction[arn:aws:lambda:ap-south-1:458993339053:function:dynamodbdata] executed in context[S3SpillLocation{bucket='swaasa-athena-db-spill-ap-south-1', key='athena-spill/d38cdaf7-7fb2-48b9-acc8-9139de2fe525/821dc5fc-f893-40f9-b25e-96b832e55c4a', directory=true}] with message[class java.lang.String cannot be cast to class java.math.BigDecimal (java.lang.String and java.math.BigDecimal are in module java.base of loader 'bootstrap')]. I tried using try_cast but still see the issue (a sketch of that kind of query follows this entry). Query ID: 0a47ace5-b8fc-46cb-8ff5-23194925882e
1 answer · 0 votes · 29 views
asked 23 days ago
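For reference, a minimal sketch of the kind of TRY_CAST query being described, run through boto3 (the data source name "dynamodb_catalog", the table, the column, and the output bucket are hypothetical placeholders; this only illustrates the query shape and is not a fix for the connector-side ClassCastException):

    # Sketch: run an Athena query against a DynamoDB connector data source,
    # casting a column with TRY_CAST. All names below are placeholders.
    import boto3

    athena = boto3.client("athena", region_name="ap-south-1")

    response = athena.start_query_execution(
        QueryString="""
            SELECT id, TRY_CAST(amount AS DECIMAL(18, 2)) AS amount
            FROM "dynamodb_catalog"."default"."orders"
            LIMIT 10
        """,
        ResultConfiguration={
            "OutputLocation": "s3://my-athena-results-bucket/"  # placeholder bucket
        },
    )
    print(response["QueryExecutionId"])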
Hello, I desperately need help connecting to an Amazon Redshift server using an ODBC driver. I followed "Configuring an ODBC connection" here: https://docs.aws.amazon.com/redshift/latest/mgmt/configure-odbc-connection.html#obtain-odbc-url, but I can't figure out what's wrong with my setup (a DSN cross-check sketch follows this entry). From R I tried:

    con <- DBI::dbConnect(odbc::odbc(),
      Driver = "/opt/amazon/redshift/lib/amazonredshiftodbc.dylib",
      Host = "rhealth-prod-4.cldcoxyrkflo.us-east-1.redshift.amazonaws.com",
      Schema = "dev",
      Port = 5439)

and I get the following error:

    Error: nanodbc/nanodbc.cpp:1118: 00000: [Amazon][ODBC] (11560) Unable to locate SQLGetPrivateProfileString function: [Amazon][DSI] An error occurred while attempting to retrieve the error message for key 'LibsLoadErr' with message parameters ['""'] and component ID 3: Message not found in file "/opt/amazon/redshift/ErrorMessages/en-US/ODBCMessages.xml"

The odbc.ini and odbcinst.ini files are in my /User/ location, so I shouldn't need to set environment variables unless I am missing something. Here are my configuration files:

odbc.ini:

    [ODBC Data Sources]
    Amazon_Redshift_dylib=Amazon Redshift DSN for macOS X
    [Amazon Redshift DSN for macOS X]
    Driver=/opt/amazon/redshift/lib/amazonredshiftodbc.dylib
    Host=rhealth-prod-4.cldcoxyrkflo.us-east-1.redshift.amazonaws.com
    Port=5439
    Database=saf
    locale=en-US

odbcinst.ini:

    [ODBC Drivers]
    Amazon_Redshift_dylib=Installed
    [Amazon_Redshift_dylib]
    Description=Amazon Redshift DSN for macOS X
    Driver=/opt/amazon/redshift/lib/amazonredshiftodbc.dylib

Any insight would be greatly appreciated.
1 answer · 0 votes · 24 views
asked 24 days ago
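One way to check whether the DSN itself loads outside of R is a minimal pyodbc sketch, assuming pyodbc is installed on the same machine and goes through the same driver manager (the UID/PWD values below are placeholders). If the same SQLGetPrivateProfileString error shows up here too, the problem is in the driver / driver-manager setup rather than in the R code:

    # Sketch: open the same DSN through pyodbc to isolate the driver-manager issue.
    # UID/PWD are placeholders for the Redshift database user and password.
    import pyodbc

    conn = pyodbc.connect(
        "DSN=Amazon Redshift DSN for macOS X;"
        "UID=awsuser;"
        "PWD=example-password;"
        "Database=saf;"
    )
    cursor = conn.cursor()
    cursor.execute("SELECT current_database(), version();")
    print(cursor.fetchone())
    conn.close()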
I'm trying to connect to a PostgreSQL RDS database using DataGrip and the AWS Toolkit plugin with IAM authentication. The instructions on the subject are confusing and inconsistent, and it's very unclear to me what I'm doing wrong. Here's what I did (a token-based sketch of IAM auth outside the IDE follows this entry):
- Enabled IAM authentication on the RDS instance.
- Installed the AWS Toolkit and configured the access keys. I can access most services just fine, including viewing files on S3 through DataGrip, so that part is functional.
- Created a policy that allows the `rds-db:connect` action and assigned it to my user.
- Used the AWS Explorer in DataGrip, selected the DB instance, and clicked "Connect with IAM auth".
When I run "Test Connection" I get an error (screenshot: /media/postImages/original/IM296IK1IDQpO-H-PlNflQ8A). The details the toolkit inserts are correct as far as I can tell. I've been googling, asking, and trying different things. Some places suggest creating a role to connect through an EC2 instance, which I completely don't understand (why would I need a whole instance? Why wouldn't I just connect directly?). It's very confusing and incredibly frustrating, and I'm stumped. What am I missing?
1 answer · 0 votes · 24 views
yuvi
asked 25 days ago
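To isolate whether the problem is the toolkit or the IAM/RDS setup, here is a minimal sketch of what IAM authentication does under the hood, assuming boto3 and psycopg2 are available (the endpoint, region, database user, and database name are placeholders; the Postgres user must also have been granted the rds_iam role):

    # Sketch: generate an IAM auth token and use it as the password for a
    # regular Postgres connection. Endpoint, user, and dbname are placeholders.
    import boto3
    import psycopg2

    HOST = "mydb.abc123xyz.us-east-1.rds.amazonaws.com"
    PORT = 5432
    USER = "iam_user"     # Postgres user granted: GRANT rds_iam TO iam_user;
    DBNAME = "postgres"

    rds = boto3.client("rds", region_name="us-east-1")
    token = rds.generate_db_auth_token(
        DBHostname=HOST, Port=PORT, DBUsername=USER, Region="us-east-1"
    )

    conn = psycopg2.connect(
        host=HOST,
        port=PORT,
        user=USER,
        password=token,      # the token is used in place of a password
        dbname=DBNAME,
        sslmode="require",   # SSL is required for IAM authentication
    )
    with conn.cursor() as cur:
        cur.execute("SELECT version();")
        print(cur.fetchone()[0])
    conn.close()

If this works with the same credentials and endpoint, the IAM/RDS side is fine and the issue is on the toolkit side.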
We are planning on exploring the MyRocks storage engine for MySQL. I am aware that RDS for MariaDB supports it, but I haven't found any article saying whether it is supported by Aurora MySQL-Compatible. Is this available now, and will it be available in the future? Thanks!
1 answer · 0 votes · 31 views
asked 25 days ago
Hi, I have added an additional table to schema.graphql and run amplify publish. The table has been created, but I am not seeing new methods in the mutations.js/queries.js/subscriptions.js files to reflect the new table. This is a Vue.js app. Is there any other command I need to run to update the local sources? Thanks.
2 answers · 0 votes · 27 views
asked 25 days ago
Hi, I have a customer who wants to store time-series data (industrial IoT) in Redshift and then query/push that data to IoT SiteWise for equipment monitoring. My initial thought is that IoT SiteWise is already a time-series data store with hot and cold tiers, so it does not make sense to use Redshift as the primary store; instead, they should use Redshift downstream of SiteWise. What do you folks think? I'd also appreciate some reasons why Redshift is a good or bad candidate in this case. Thanks!
2 answers · 0 votes · 28 views
asked 25 days ago
We have created a new RDS database, and we have its public DNS record to connect with. How can I find its private IP so we can connect from within the AWS network in the same zone? (A short resolution sketch follows this entry.)
2 answers · 0 votes · 41 views
asked a month ago
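A minimal sketch of one common way to find the address, under the assumption that the client runs inside the same VPC: in that context the RDS endpoint DNS name normally resolves to the instance's private IP, so resolving it from an EC2 instance in that network shows the address. The endpoint below is a placeholder.

    # Sketch: resolve the RDS endpoint from inside the VPC; in that context the
    # DNS name typically resolves to the private IP. Endpoint is a placeholder.
    import socket

    endpoint = "mydb.abc123xyz.us-east-1.rds.amazonaws.com"
    private_ip = socket.gethostbyname(endpoint)
    print(f"{endpoint} -> {private_ip}")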
I've been leveraging the pg_cron extension to run scheduled precompute jobs within the Postgres DB for over a year now and have never had this issue. This extension comes with a table that logs the jobs that need to be run, as well as a function that lets you easily create new jobs and insert them into that table (a sketch of that call follows this entry). Today, when trying to schedule a new cron job in the Postgres DB, the DB crashed, terminating all existing connections to the DB. We haven't changed anything in the DB since yesterday, so it is confusing why this extension might be related to the crash. I'm hoping someone can provide some insight.
0 answers · 0 votes · 14 views
ekblaze
asked a month ago
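For context, a minimal sketch of the scheduling call the question describes, via psycopg2 (connection details, job name, schedule, and the scheduled command are placeholders; this only illustrates pg_cron's cron.schedule function and says nothing about the cause of the crash):

    # Sketch: register a pg_cron job by calling cron.schedule from a client.
    # Connection parameters, job name, and the scheduled command are placeholders.
    import psycopg2

    conn = psycopg2.connect(
        host="mydb.abc123xyz.us-east-1.rds.amazonaws.com",
        dbname="postgres",
        user="postgres",
        password="example-password",
    )
    conn.autocommit = True

    with conn.cursor() as cur:
        # cron.schedule(job_name, schedule, command) returns the new job id;
        # the job then shows up in the cron.job table.
        cur.execute(
            "SELECT cron.schedule(%s, %s, %s);",
            ("nightly-precompute", "0 3 * * *", "CALL refresh_precomputed_tables();"),
        )
        print("scheduled job id:", cur.fetchone()[0])

    conn.close()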
Is a Postgres Serverless v2 instance a good fit for a production transactional system?
3 answers · 0 votes · 28 views
asked a month ago
How do we enable TLS in oracle_fdw? As I understand it, we need to create a wallet and import the server-side certificate, but how do we do that? Is that even possible?
0 answers · 0 votes · 10 views
asked a month ago
How can we change SQLNET.ENCRYPTION_CLIENT from "Accepted" to "Requested" for the oracle_fdw extension? Which parameter has to be changed?
0 answers · 0 votes · 4 views
asked a month ago