Some background information: I am new to provisioning and followed this tutorial:
https://www.digitalocean.com/community/articles/automated-provisioning-of-digitalocean-cloud-servers-with-salt-cloud-on-ubuntu-12-04
It explains how to set up a salt-master and salt-minion remotely using a salt-cloud setup. It also covers a few security measures: setting a different port for SSH, switching off root access and creating a different user with root permissions for everyday use, and, last but not least, setting up a firewall that opens the custom SSH port and ports 4505 and 4506, which are used by Salt.
Question
The article doesn't say anything about this, but shouldn't the same security measures be taken into consideration for the minions?
The bootstrap.sh script (that is used to bring up the minion(s)) doesn't seem to apply those settings. For example, running sudo salt 'minion01' cmd.run 'cat /etc/ssh/sshd_config' shows me that port 22 is used and root access is permitted on the minion, and sudo salt '*' cmd.run 'ufw status verbose' shows that no firewall is installed.
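For what it's worth, the same checks could also be gathered for every minion from the master via Salt's Python API. The sketch below is only an illustration, not something from the tutorial: it assumes the salt Python package is importable on the master and that the script runs with master privileges.

```python
# Audit script, run on the salt master: report whether each minion still has
# the defaults the question mentions (SSH on port 22, root login permitted,
# no ufw firewall). Assumes the `salt` package is importable and that this
# runs with the privileges the master normally uses.
import salt.client

local = salt.client.LocalClient()

# Same information as the cmd.run calls above, collected per minion.
sshd = local.cmd('*', 'cmd.run',
                 ['grep -E "^(Port|PermitRootLogin)" /etc/ssh/sshd_config'])
ufw = local.cmd('*', 'cmd.run', ['ufw status verbose || echo "ufw missing"'])

for minion in sorted(sshd):
    print('--- {0} ---'.format(minion))
    print('sshd: {0}'.format(sshd.get(minion)))
    print('ufw:  {0}'.format(ufw.get(minion)))
```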
Related
I have tried the iperf container at https://github.com/iitggithub/iperf-web. I was able to configure the container and open the web page successfully, but when I try to run the test from the client it only connects server to server and does not work as expected.
If someone has tried this container, could you let me know the process?
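One way to narrow this down is to take the web UI out of the picture and run a plain iperf client against the server directly. The sketch below is an illustration built on assumptions: it presumes the container publishes the default iperf port 5001, that the iperf binary is installed on the client machine, and that SERVER_HOST is a placeholder address.

```python
# Sanity check that bypasses the web UI: run a plain iperf client against the
# server container to confirm the two hosts can exchange traffic at all.
# Assumptions: the container publishes the default iperf port 5001, the iperf
# binary exists on the client machine, and SERVER_HOST is a placeholder.
import subprocess

SERVER_HOST = "192.0.2.10"   # replace with the host running iperf-web
PORT = 5001                  # default iperf port; change if it was remapped

result = subprocess.run(
    ["iperf", "-c", SERVER_HOST, "-p", str(PORT), "-t", "5"],
    capture_output=True, text=True)

print(result.stdout or result.stderr)
```

If this works, the problem is in how the web page drives the test rather than in the network path between the two machines.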
I am running a server hosting my iPhone messaging app. To control this server, I have written a Python script that talks to it over the localhost IP on port 80. Now I wish to control the server remotely, for which I need an access mechanism: either host the Python script in the cloud and connect it directly to the messaging server on my computer, or keep both the server and the script on my computer and access the script remotely.
I have temporarily hosted a FB app on pythonanywhere for free and it worked like a charm.
In case you plan to use Django, the version they offer usually lags behind.
In that case, I'd highly recommend using OpenShift, by Red Hat.
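Whichever host you pick, the "keep both on my computer, reach the script remotely" option from the question could look roughly like the sketch below. It is only an illustration: the /command path, the listening port 8080, and the forwarding target are my assumptions, and anything exposed like this needs authentication (and ideally TLS) before it faces the internet.

```python
# Minimal relay: accept a command over HTTP and forward it to the messaging
# server that already listens on localhost:80. The /command path and port
# 8080 are made up for illustration; add authentication before exposing this.
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib import request as urlrequest


class ControlHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length)
        # Forward the received command to the local messaging server.
        forwarded = urlrequest.Request("http://127.0.0.1:80/command", data=body)
        try:
            with urlrequest.urlopen(forwarded, timeout=10) as resp:
                payload, status = resp.read(), resp.status
        except OSError as exc:
            payload, status = str(exc).encode(), 502
        self.send_response(status)
        self.end_headers()
        self.wfile.write(payload)


if __name__ == "__main__":
    # Listen on all interfaces so the script is reachable from outside.
    HTTPServer(("0.0.0.0", 8080), ControlHandler).serve_forever()
```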
I'm basically looking for something like picloud.com (shut down last year) but that is able to run on my own cluster of servers.
I'd imagine some kind of server process running on each of the grunt-work servers; when Python code is run on my main server, it should send the job to a work server that is not already "filled".
The load balancing should come at a later stage, right now what I need is a way to run local python code on the remote server I define.
Try RPyC.
From the site:
RPyC (pronounced as are-pie-see), or Remote Python Call, is a transparent python library for symmetrical remote procedure calls, clustering and distributed-computing. RPyC makes use of object-proxying, a technique that employs python’s dynamic nature, to overcome the physical boundaries between processes and computers, so that remote objects can be manipulated as if they were local.
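As a taste of what that looks like in practice, here is a minimal sketch of RPyC's classic mode. It assumes RPyC is installed on both machines, that each worker runs the bundled classic server (the rpyc_classic.py script) on its default port 18812, and that "workerhost" stands in for one of the grunt-work servers.

```python
# Classic-mode RPyC client: attribute access and evaluation through `conn`
# execute on the remote worker, not locally. Requires a classic server
# (rpyc_classic.py) running on the worker; "workerhost" is a placeholder.
import rpyc

conn = rpyc.classic.connect("workerhost")   # default port 18812

# Remote modules are proxied, so this os call runs on the worker.
print("remote pid:", conn.modules.os.getpid())

# Arbitrary expressions can be evaluated in the remote interpreter too.
print("remote cpu count:", conn.eval("__import__('multiprocessing').cpu_count()"))

conn.close()
```

Picking a worker that is not "filled" would still be up to you, for example by asking each connection for its current load before dispatching a job.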
We have a server running 64-bit Ubuntu 12.04 which serves a Django application running in a virtual environment, along with a Celery worker.
The problem is that no process on the server uses more than 2 GB of memory. There are 8 GB of memory available on the system, with 4 GB of swap. A memory graph of the system shows that the processes appear to be capped by some factor.
We did not do anything to force the processes to run in 32-bit mode.
I should also note that both Django and Celery are managed by the supervisor process manager and monitored by the New Relic client; I do not know whether those have anything to do with the memory limit.
Why does the Celery process not allocate more memory when it needs to?
System: Ubuntu 12.04 64 bit
Client: Django application (2 GB)
RAM: 8 GB
Swap: 4 GB
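One thing worth ruling out (an assumption on my part, not something stated in the question) is a per-process resource limit inherited from supervisor or from the shell that started the services. A quick check from inside the Django or Celery process would look like this:

```python
# Print the memory-related rlimits of the current process; run this from a
# Django management command or a Celery task and compare the soft limits
# with the ~2 GB ceiling. RLIM_INFINITY means no limit is set for that entry.
import resource


def human(value):
    if value == resource.RLIM_INFINITY:
        return "unlimited"
    return "%.1f GB" % (value / 1024.0 ** 3)


for name in ("RLIMIT_AS", "RLIMIT_DATA", "RLIMIT_RSS"):
    soft, hard = resource.getrlimit(getattr(resource, name))
    print("%s: soft=%s hard=%s" % (name, human(soft), human(hard)))
```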
I need to create a simple server/daemon which just waits for commands (maybe queues them up) and executes other commands based on the received input. Here are more detailed requirements:
Should be in Python since I need to use some other Python code I wrote for it.
Needs to work on at least Linux and Windows (will be running as an init.d service on Linux and as a Windows service on Windows)
Communication medium should be as simple as possible and hopefully commands can be sent to the server from a batch script
Commands come from the same machine. Performance isn't important.
Installation on Windows (including all the libraries needed to send a command to the server from a script) should be simple. If everything can be wrapped into a single .exe all the better.
What's the best stack to use for this? I have only a few vague ideas:
CherryPy (Windows doesn't have cURL though, ugh...)
dbus and windbus (never used these before...)
In my experience (and I've implemented three or four applications similar to what you've described), all things being equal, I've found it simplest to go with a subclass of SocketServer.TCPServer and implement my own simple command system. This gives you full control over the details (need streaming? No problem. Need stateful connections? No problem), and it isn't all that tricky given some basic knowledge of how sockets work.
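To make that concrete, here is a bare-bones sketch of the approach, written against the Python 3 module name socketserver (SocketServer above is the Python 2 name for the same module); the ping/quit commands are invented for illustration.

```python
# Line-oriented command server: a ThreadingMixIn + TCPServer subclass with a
# StreamRequestHandler that reads one command per line and writes a reply.
# The ping/quit commands are placeholders for a real command set.
import socketserver


class CommandHandler(socketserver.StreamRequestHandler):
    def handle(self):
        for raw in self.rfile:                     # one command per line
            command = raw.decode().strip().lower()
            if command == "ping":
                reply = "pong"
            elif command == "quit":
                self.wfile.write(b"bye\n")
                return
            else:
                reply = "unknown command: %s" % command
            self.wfile.write(reply.encode() + b"\n")


class CommandServer(socketserver.ThreadingMixIn, socketserver.TCPServer):
    allow_reuse_address = True


if __name__ == "__main__":
    # Bind to localhost only, since commands come from the same machine; any
    # TCP client (even a short python -c one-liner) can drive it.
    CommandServer(("127.0.0.1", 9000), CommandHandler).serve_forever()
```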
On Windows, I understand that py2exe does a good job.
A couple alternatives/tools you might want to consider:
SimpleXMLRPCServer and xmlrpclib — if you can live with the constraints imposed by HTTP + XMLRPC (stateless, streaming is tricky), this will get you up and running in about 5 minutes; a short sketch follows after this list. Also note that SimpleXMLRPCServer is single-threaded. It's possible to make it multi-threaded (or multi-process, or whatever), but that will take a little bit of work.
On Linux, python-daemon is a possibility for daemonizing, but I've usually found start-stop-daemon to be simpler (if it's available on your platform)
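For completeness, here is roughly what the XML-RPC variant from the list above might look like, using the Python 3 names xmlrpc.server and xmlrpc.client for the SimpleXMLRPCServer and xmlrpclib mentioned there; run_job is a made-up stand-in for whatever commands the real service would expose.

```python
# Single-threaded XML-RPC command server; register one function per command.
# run_job is a placeholder for the actual command dispatch.
from xmlrpc.server import SimpleXMLRPCServer


def run_job(name):
    return "started job %r" % name


server = SimpleXMLRPCServer(("127.0.0.1", 8000), allow_none=True)
server.register_function(run_job)
server.serve_forever()
```

A batch script can then drive it with a one-line client, e.g. python -c "import xmlrpc.client; print(xmlrpc.client.ServerProxy('http://127.0.0.1:8000').run_job('backup'))".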
The question "Is it possible to run a Python script as a service in Windows? If possible, how?" should help you with running the script as a service.