I have very little knowledge of Rackspace. My client has set up a server on the Rackspace cloud and says he created a database there from which I have to retrieve data. Logging into https://mycloud.rackspace.com/, I found that the server is created at
PublicNet (Internet) 166.78.105.176
and I can also access the database via phpMyAdmin at http://"link"/phpmyadmin
But the problem is that, to connect to the database as described in http://www.rackspace.com/knowledge_center/article/connecting-to-your-cloud-database, I need the instance ID of that database from the cloud control panel, which I am not able to get because I cannot see the database in the cloud control panel.
Please help: am I heading in the right direction, or am I on totally the wrong path? Any suggestions would be helpful.
I've been trying to use Continuous Query Notification (CQN) in a Python script to get notifications from the database about changes made to a specific table.
I followed the tutorial from this link:
https://python-oracledb.readthedocs.io/en/latest/user_guide/cqn.html
The connection to the Oracle database was successful and I can query the table to get results, but I can't get any message out of the callback function, which looks like this:
def cqn_callback(message):
    print("Notification:")
    for query in message.queries:
        for tab in query.tables:
            print("Table:", tab.name)
            print("Operation:", tab.operation)
            for row in tab.rows:
                if row.operation & oracledb.OPCODE_INSERT:
                    print("INSERT of rowid:", row.rowid)
                if row.operation & oracledb.OPCODE_DELETE:
                    print("DELETE of rowid:", row.rowid)

subscr = connection.subscribe(callback=cqn_callback,
                              operations=oracledb.OPCODE_INSERT | oracledb.OPCODE_DELETE,
                              qos=oracledb.SUBSCR_QOS_QUERY | oracledb.SUBSCR_QOS_ROWIDS)
subscr.registerquery("select * from regions")
input("Hit enter to stop CQN demo\n")
I can see that the registration is created in the database after I run the script, but I just don't receive any message about an insert or delete after I perform either of those operations through SQL*Plus or SQL Developer.
I have been reading other questions and blogs about this functionality, so far without success, so if anyone has any recommendations or has encountered a similar problem, please comment or answer here.
Oracle Database 12c running in Docker.
Python version is 3.10.7
I am running it in thick mode, and for the Oracle client libraries I am using this call:
oracledb.init_oracle_client(lib_dir=".../instantclient_21_3")
P.S. This is my first time posting a question here, so if I didn't correctly follow the structure or rules for asking a question, please correct me. Thanks in advance :)
Please take a look at the requirements for CQN in the documentation. Note in particular that the database needs to connect back to the application; if it cannot, no notifications will take place even though the registration succeeds in the database. Oracle Database 19.4 introduced a new mode that eliminates this requirement, but since you are still using 12c that won't work for you. You will need to ensure that the database can connect back to the application: open up any required ports, and make sure an IP address is directly specified in the subscription parameters or can be looked up from the name of the client machine connecting to the database.
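For example, a minimal sketch of pinning the connect-back address on the subscription (the IP address and port here are made-up values for illustration; use an address the database host can actually reach and a port your firewall allows):

subscr = connection.subscribe(
    callback=cqn_callback,
    operations=oracledb.OPCODE_INSERT | oracledb.OPCODE_DELETE,
    qos=oracledb.SUBSCR_QOS_QUERY | oracledb.SUBSCR_QOS_ROWIDS,
    ip_address="192.0.2.10",  # hypothetical client IP, reachable from the DB host
    port=49152)               # hypothetical port, opened in the client's firewall
subscr.registerquery("select * from regions")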
Moving this question from DevOps Stack Exchange where it got only 5 views in 2 days:
I would like to query an Azure Database for MySQL Single Server.
I normally interact with this database using a universal database tool (dBeaver) installed onto an Azure VM. Now I would like to interact with this database using Python from outside Azure. Ultimately I would like to write an API (FastAPI) allowing multiple users to connect to the database.
I ran a simple test from a Jupyter notebook, using SQLAlchemy as my ORM and specifying the pem certificate as a connection argument:
import pandas as pd
from sqlalchemy import create_engine
cnx = create_engine('mysql://XXX', connect_args={"ssl": {"ssl_ca": "mycertificate.pem"}})
I then tried reading data from a specific table (e.g. mytable):
df = pd.read_sql('SELECT * FROM mytable', cnx)
Alas I ran into the following error:
'Client with IP address 'XX.XX.XXX.XXX' is not allowed to connect to this MySQL server'.
According to my colleagues, a way to fix this issue would be to whitelist my IP address.
While this may be an option for a couple of users with static IP addresses I am not sure whether it is a valid solution in the long run.
Is there a better way to access an Azure Database for MySQL Single Server from outside Azure?
As mentioned in the comments, you need to whitelist the IP address range(s) in the Azure portal for your MySQL database resource. This is a well-accepted and secure approach.
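If you prefer to script the whitelisting instead of clicking through the portal, a rough sketch using the azure-mgmt-rdbms SDK might look like the following (subscription, resource group, server, and rule names are placeholders, and exact method names can differ between SDK versions, so treat this as an outline rather than a definitive implementation):

from azure.identity import DefaultAzureCredential
from azure.mgmt.rdbms.mysql import MySQLManagementClient

# hypothetical names -- substitute your own subscription, group and server
client = MySQLManagementClient(DefaultAzureCredential(), "SUBSCRIPTION_ID")
client.firewall_rules.begin_create_or_update(
    "my-resource-group",
    "my-mysql-server",
    "AllowMyClient",
    {"start_ip_address": "XX.XX.XXX.XXX", "end_ip_address": "XX.XX.XXX.XXX"},
).result()  # wait until the rule is applied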
I am trying to connect Snowflake to Python using my Snowflake user credentials, but I get an error while executing (I have cross-checked my credentials and everything is correct). I later tried my colleague's user credentials with the same code, changing only the credentials, and it worked with no error: Snowflake connected to his account. Can anyone help me figure out where the problem is? (error details)
There are a couple of URLs that can be used to access your Snowflake account. I recommend SNOWFLAKE_DEPLOYMENT_REGIONLESS.
You can run this query on your account to find them:
-- account URLs
select t.value:type::varchar as type,
t.value:host::varchar as host,
t.value:port as port
from table(flatten(input => parse_json(system$whitelist()))) as t
where t.value:type::varchar like 'SNOWFLAKE%';
There are several factors that could be impacting whether or not you can connect, including network policies or firewalls.
You can use SnowCD (Connectivity Diagnostic Tool) to rule out there are any issues connecting to Snowflake from your machine.
If you can connect from your local machine but are attempting to connect via Python from a remote machine, the issue is very likely that a network policy (a Snowflake-defined firewall) has been set by your Snowflake admin to restrict the IP addresses that can connect to Snowflake.
If SnowCD reports no errors and network policies are ruled out, reach out to Snowflake support for further investigation.
If for some reason hyphens are not supported in the URL, you can replace them with underscores. The account identifier formats are:

organization_name-account_name (for most URLs and other general purpose usage)
organization_name_account_name (for scenarios/features where hyphens are not supported in URLs)
organization_name.account_name (for SQL commands and operations)

Where:
organization_name is the name of your Snowflake organization.
account_name is the unique name of your account within your organization.
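For example, with the Snowflake Python connector, the organization_name-account_name form is what you pass as the account parameter (user and password below are placeholders):

import snowflake.connector

conn = snowflake.connector.connect(
    account="organization_name-account_name",  # identifier format from above
    user="YOUR_USER",
    password="YOUR_PASSWORD")
print(conn.cursor().execute("select current_account()").fetchone())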
I have a login interface built with tkinter, with sqlite3 as the database, and everything works fine. In my database, stored locally on my PC, I've created a username and password which I use to log in. I would like to know if there is a way to store just my sqlite .db file in the cloud or on some server so that I can still log in with my tkinter interface from any computer, using my database in the cloud.
This is what I'm using to connect to my sqlite database locally, and it works smoothly:
conn = sqlite3.connect('login_file.db')
c = conn.cursor()
user = entry_usuario.get()
contra = entry_contrasena.get()
c.execute('SELECT * FROM superusuario WHERE usuario = ? AND password = ?', (user, contra))
if c.fetchall():
    messagebox.showinfo(title='login correcto', message='usuario y contraseƱa correctos')
else:
    messagebox.showerror(title=None, message='ContraseƱa Incorrecta')
c.close()
P.S.: I was trying to use Firebase Authentication to link with my tkinter login interface, but I wasn't successful with it (I don't know how to replace it). Maybe I should use another server? If you have any advice, please let me know. Thanks in advance and have a good day.
sqlite is a file-based database with no built-in network server, so your application needs to access it as a file in a known location.
The only way to do this without a server-side function is to host it on a remote network drive and mount it on your PC; but doing that leaves your data exposed, since sqlite databases aren't password protected in any form: anyone could download the database and open it.
To protect it you would need to implement a network server (maybe on an AWS instance, or similar) that provides protected access and exposes the data as a REST API; or, even better, don't use sqlite if you want a remote database.
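As a rough sketch of that last approach (Flask is used here purely as an illustration, the endpoint name is made up, and a real deployment would also hash passwords and use HTTPS):

import sqlite3
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route('/login', methods=['POST'])
def login():
    # the .db file stays on the server; clients only ever see this endpoint
    data = request.get_json()
    conn = sqlite3.connect('login_file.db')
    c = conn.cursor()
    c.execute('SELECT * FROM superusuario WHERE usuario = ? AND password = ?',
              (data.get('usuario'), data.get('contrasena')))
    ok = bool(c.fetchall())
    conn.close()
    return jsonify({'ok': ok})

The tkinter client would then call requests.post('http://YOUR_SERVER/login', json={'usuario': user, 'contrasena': contra}) instead of opening the database file directly.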
I read all the documentation related to connecting to MySQL hosted in Cloud SQL from Google Cloud Functions (GCF) and still can't connect. I also tried all the hints in the SQLAlchemy documentation related to this.
I am using the following connection:
con = 'mysql+pymysql://USER:PASSWORD@/MY_DB?unix_socket=/cloudsql/Proj_ID:Zone:MySQL_Instance_ID'
mysqlEngine = sqlalchemy.create_engine(con)
The error I got was:
(pymysql.err.OperationalError) (2003, "Can't connect to MySQL server on 'localhost' ([Errno 111] Connection refused)") (Background on this error at: http://sqlalche.me/e/e3q8)
You need to make sure you are using the correct /cloudsql/<INSTANCE_CONNECTION_NAME> (This is in the format <PROJECT_ID>:<REGION>:<INSTANCE_ID>). This should be all that's needed if your Cloud SQL instance is in the same project and region as your Function.
The GCF docs also strongly recommends limiting your pool to a single connection. This means you should set both pool_size=1 and max_overflow=0 in your engine settings.
If you would like to see an example of how to set these settings, check out this sample application on Github.
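Putting those pieces together, a minimal sketch of the engine configuration described above (all connection details are placeholders):

import sqlalchemy

engine = sqlalchemy.create_engine(
    'mysql+pymysql://USER:PASSWORD@/MY_DB'
    '?unix_socket=/cloudsql/PROJECT_ID:REGION:INSTANCE_ID',
    pool_size=1,      # GCF docs recommend a single connection per instance
    max_overflow=0)   # no extra connections beyond the pool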
I believe that your problem is with the Connection_name represented by <PROJECT_ID>:<REGION>:<INSTANCE_ID> at the end of the con string variable.
Which by the way should be quoted:
con = 'mysql+pymysql://USER:PASSWORD@/MY_DB?unix_socket=/cloudsql/<PROJECT_ID>:<REGION>:<INSTANCE_ID>'
Check if you are writing it right with this command:
gcloud sql instances describe <INSTANCE_ID> | grep connectionName
If this is not the case, keep in mind these considerations present in the Cloud Functions official documentation:
First Generation MySQL instances must be in the same region as your Cloud Function. Second Generation MySQL instances as well as PostgreSQL instances work with Cloud Functions in any region.
Your Cloud Function has access to all Cloud SQL instances in your project. You can access Second Generation MySQL instances as well as PostgreSQL instances in other projects if your Cloud Function's service account (listed on the Cloud Function's General tab in the GCP Console) is added as a member in IAM on the project with the Cloud SQL instance(s) with the Cloud SQL Client role.
After a long thread with Google Support, we found the reason: we simply had to enable public access to Cloud SQL, without any firewall rule. It is undocumented and can drive you crazy, but the silver bullet for the support team is to say: it is in beta!
I was having this issue. Service account was correct, had the correct permissions, same exact connection string as in my App Engine application. Still got this in the logs.
dial unix /cloudsql/project:region:instance connect: no such file or directory
Switching from a 2nd generation Cloud Function to a 1st generation one solved it. I didn't see it documented anywhere that 2nd gen functions couldn't connect to Cloud SQL instances.