It is still a mystery to me how to expose Kafka to an external network. I deployed Kafka on a Google Cloud virtual machine using "Kafka certified by Bitnami", which runs Kafka 2.3.0 and Zookeeper 3.5.5. I opened the firewall on ports 9092 and 2181 (yes, I can telnet to them) and assigned a static IP address to my machine. My server.properties file looks like this:
listeners=SASL_PLAINTEXT://:9092
advertised.listeners=SASL_PLAINTEXT://<static_ip>:9092
zookeeper.connect=localhost:2181
I tried to connect from Python:
from kafka.admin import KafkaAdminClient
import logging

logging.basicConfig(level=logging.DEBUG)

admin_client = KafkaAdminClient(
    bootstrap_servers="<static_ip>:9092",
    client_id="test",
    sasl_plain_username="bitnami_provided_user",
    sasl_plain_password="bitnami_provided_password",
)
What I get is this error: Closing connection. KafkaConnectionError: socket disconnected.
I also changed SASL_PLAINTEXT to PLAINTEXT, and then I get a NoBrokersAvailable error.
I went through the thread Not able to connect to kafka server on google compute engine from local machine and tried the same, but without luck. Is it really so difficult to expose Kafka to an external network? I just want to access it from home, from my local computer.
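For reference, a broker configuration for this kind of setup would look roughly like the following. This is a sketch, assuming the PLAIN SASL mechanism that the Bitnami image documents; note that zookeeper.connect must point at ZooKeeper's port 2181, not Kafka's 9092:

```properties
# Bind on all interfaces; advertise the VM's static external IP to clients
listeners=SASL_PLAINTEXT://0.0.0.0:9092
advertised.listeners=SASL_PLAINTEXT://<static_ip>:9092

# SASL/PLAIN mechanism (assumed; check which mechanism your image enables)
sasl.enabled.mechanisms=PLAIN
sasl.mechanism.inter.broker.protocol=PLAIN
security.inter.broker.protocol=SASL_PLAINTEXT

# ZooKeeper listens on 2181, not 9092
zookeeper.connect=localhost:2181
```

On the client side, kafka-python defaults to security_protocol="PLAINTEXT", so a SASL listener will drop the connection unless security_protocol="SASL_PLAINTEXT" and sasl_mechanism="PLAIN" are also passed to KafkaAdminClient alongside the username and password.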
I have been banging my head against a wall trying to get my Streamlit app deployed on an EC2 instance so I can share it with others, but I am having trouble connecting to the app via the browser. I noticed the same issue on my local machine: when I run the app locally, I can access it via localhost:
http://127.0.0.1:8501/
However, the "External URL" and "Network URL" do not work: the page loads indefinitely and eventually times out. Here are the external URL and network URL given by Streamlit when you run streamlit run app.py:
Collecting usage statistics. To deactivate, set browser.gatherUsageStats to False.
You can now view your Streamlit app in your browser.
Network URL: http://<network_ip>:8501
External URL: http://<external_ip>:8501
I can confirm that I have allowed inbound TCP traffic on port 8501 on my local Windows machine as well as on the EC2 instance.
Here is my security group config:
EC2 Security Group
How do I make my Streamlit application accessible via the Network URL and External URL given by Streamlit, and not just via localhost:8501?
Would appreciate anyone's advice who has deployed a streamlit web app on an EC2 instance!
I have checked that:
My EC2 instance is listening on TCP port 8501
The Streamlit service is running (via "ps aux | grep streamlit")
I have also tried restarting my instance.
So this was a bit of a face-palm moment for me: the problem I was encountering was caused by my corporate network's internal firewall blocking connections to port 8501.
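For anyone hitting the same symptom where the cause is not a firewall, it is also worth making sure Streamlit binds to all interfaces rather than just localhost. A .streamlit/config.toml along these lines does that (a sketch using Streamlit's documented server options):

```toml
# .streamlit/config.toml
[server]
address = "0.0.0.0"   # listen on all interfaces, not just localhost
port = 8501
headless = true       # don't try to open a browser on the server

[browser]
gatherUsageStats = false
```

The same options can also be passed on the command line, e.g. streamlit run app.py --server.address 0.0.0.0.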
I have created a site that connects to a ZKTeco K40 device. The connection method is pretty simple:
from zk import ZK, const

# Device on the local network, default ZKTeco port 4370
zk = ZK('192.168.1.13', port=4370, timeout=5)
conn = zk.connect()
This connects fine when running from localhost on the same network.
But after hosting the site, it is not able to ping the device and cannot connect to it.
How can I connect to a device on a PC's local network from a hosted Django site? My site is hosted on cPanel.
From the server, you cannot reach a machine that sits on some private local network. You have to use the CAMS biometric API to communicate with your biometric device from a remotely hosted application.
You can use https://www.zerotier.com/ to expose the device over the internet and then connect to it. It's not advisable, but that's one way.
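Another common pattern is to invert the connection: run a small relay script on a PC inside the device's network that reads from the device and pushes records up to the hosted site, so the server never needs to reach into the LAN. A minimal sketch, where the /api/attendance endpoint and the record fields are assumptions for illustration:

```python
import json
import urllib.request

SITE_URL = "https://example.com/api/attendance"  # hypothetical endpoint


def to_payload(records):
    """Serialize attendance records (dicts with user_id and timestamp)
    into the JSON body the hosted site would expect."""
    return json.dumps({"records": records}).encode("utf-8")


def push(records, url=SITE_URL):
    """POST the records to the hosted site from inside the LAN."""
    req = urllib.request.Request(
        url,
        data=to_payload(records),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=10) as resp:
        return resp.status


if __name__ == "__main__":
    # On the LAN machine the records would come from the device, e.g.:
    #   from zk import ZK
    #   conn = ZK('192.168.1.13', port=4370, timeout=5).connect()
    #   records = [{"user_id": a.user_id, "timestamp": str(a.timestamp)}
    #              for a in conn.get_attendance()]
    records = [{"user_id": "42", "timestamp": "2023-01-01 09:00:00"}]
    print(to_payload(records).decode("utf-8"))
```

The hosted Django site then only needs an ordinary authenticated API view to receive the records, which works from any hosting environment, cPanel included.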
I am running Ubuntu 18.04 and am following this tutorial to set up a Flask server:
https://www.digitalocean.com/community/tutorials/how-to-serve-flask-applications-with-gunicorn-and-nginx-on-ubuntu-18-04
If I log off and try to log back in, I am unable to SSH into my instance, and it gives me this error:
Connection via Cloud Identity-Aware Proxy Failed
Code: 4003
Reason: failed to connect to backend
You may be able to connect without using the Cloud Identity-Aware Proxy.
I have tried creating an instance from an image of the original.
I've tried adjusting my firewall and then SSHing in on another port.
I've tried connecting without using the Cloud Identity-Aware Proxy.
And it happens every time I set up a new machine AFTER I set up Nginx.
There are some other people on here who have encountered the same problem, like Error 4003: can't ssh login into the instance that i created in google cloud platform and Can't SSH into Google Cloud VM, but neither thread really has any helpful answers. Has anyone who's encountered this been able to fix it?
Turns out the issue was the firewall.
I had enabled ufw but forgot to allow SSH connections, and I locked myself out.
I created an entirely new machine and allowed port 22 and SSH from the get-go.
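The order of operations is what matters here: allow SSH before turning ufw on, so enabling the firewall cannot lock you out. A sketch, assuming the stock ufw application profiles that Ubuntu ships with OpenSSH and Nginx installed:

```shell
# Allow SSH first, so enabling the firewall cannot lock you out
sudo ufw allow OpenSSH        # or: sudo ufw allow 22/tcp

# Then the web ports Nginx serves
sudo ufw allow 'Nginx Full'   # ports 80 and 443

# Only now enable the firewall and verify the rules
sudo ufw enable
sudo ufw status
```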
I have a Python application behind a Node.js frontend, running on a Linux VM on Google Cloud Platform (GCP).
The command node appname runserver 8080 starts a local server within the VM, but I am wondering what the step-by-step process would be to access it via a DNS name from the outside world.
Or, if there is a better approach to hosting Python ML applications behind a web interface, please suggest one.
You need to use forever for this.
forever moves the Node process to the background, and the service keeps running even after you log out of the server. To access it from outside, point a DNS domain at the machine's IP address and then proxy-pass requests from port 80 to the port your service is running on.
Then you will be able to access it via the domain name.
Look for the ProxyPass directive in the HTTP server. That would work for you. :D
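A minimal Apache virtual host along those lines might look like this (a sketch; the domain is a placeholder, and mod_proxy and mod_proxy_http must be enabled):

```apache
# /etc/apache2/sites-available/app.conf
<VirtualHost *:80>
    ServerName app.example.com

    # Forward all requests to the Node process listening on port 8080
    ProxyPreserveHost On
    ProxyPass        / http://127.0.0.1:8080/
    ProxyPassReverse / http://127.0.0.1:8080/
</VirtualHost>
```

Enable it with a2enmod proxy proxy_http, a2ensite app, and a reload of Apache; Nginx users can achieve the same with a location block and proxy_pass.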
I am using the Azure IoT Hub Client SDK for Python, with a slightly modified version of the sample script from the GitHub repo to upload files to the IoT Hub. Everything works fine as long as I do not have to use a proxy for outgoing connections.
I tried to find out how to configure a proxy for this, but found nothing for the Python SDK. I also searched the other SDKs and found some ProxySettings in iothub_client_options.h of the C SDK, but I do not know how to set these settings from the Python client (or whether they actually work).
I also found an issue saying that connections over WebSockets need a special format for the Linux environment variables, but I am not using WebSockets.
I ran my script in both Windows and Linux environments where the proxy system settings are configured correctly (Windows: Internet settings; Linux: environment variables).
Is there any documentation on this topic? Does anybody know how to configure a proxy on either Windows or Linux?
In my experience, a Python script using the Azure IoT Hub Client SDK can communicate with Azure IoT Hub without any explicit proxy settings, as long as the OS proxy is configured correctly.
However, there are some points to keep in mind depending on which protocol (HTTP, SOCKS, etc.) the proxy server supports.
A proxy server is normally configured for the HTTP protocol, which only allows HTTP communication. So if the IoT Hub client uses HTTP mode, the script works fine, but it does not work in AMQP/MQTT mode.
If the proxy server supports a SOCKS protocol such as SOCKS4/SOCKS5, the script works in any mode, because SOCKS simply relays the traffic without inspecting the application protocol.
So please check which protocols your proxy server supports, then either use HTTP mode or configure a SOCKS proxy so that the script works.
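As a concrete sketch of the environment-variable route on Linux, the standard proxy variables can be set before the client is created. Whether the IoT Hub transport actually honors them depends on the SDK version, so treat this as an assumption to verify; the proxy address is a placeholder:

```python
import os
import urllib.request

# Placeholder proxy address; replace with your corporate proxy
os.environ["http_proxy"] = "http://proxy.example.com:3128"
os.environ["https_proxy"] = "http://proxy.example.com:3128"

# Libraries that respect the standard variables will now route through
# the proxy; urllib shows what was picked up from the environment
print(urllib.request.getproxies()["https"])  # → http://proxy.example.com:3128
```

Setting the variables in the process environment like this (or exporting them in the shell before launching the script) is equivalent; the key point is that they must be in place before the SDK opens its connection.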