I am running Win7 in a VirtualBox VM, and my goal is to list the files that are inside the Win7 VM from outside the VM, for example with a Python client. I have network access to the VM. Is the best practice to share all the files and folders using Samba and access them over the network with a Python client? Any other suggestions? I also want to be able to download the files. (The client will run on OSX/Linux.)
You can use WinSCP - https://winscp.net/eng/download.php
This will let you access the files with a nice GUI. Make sure you select the Commander option while installing WinSCP; it gives you two panes - one for your host and one for your VM.
If you are planning to make the files downloadable for users on a private network, you can install a XAMPP server inside your VM, place the files to be downloaded inside "C:\xampp\htdocs\dashboard\", and share the URL (e.g. "192.168.10.2:5000/dashboard") with the users on the same network so that they can download the required files.
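If you go the HTTP route, the download side is easy to script in Python as well; here is a minimal sketch (the URL and file name below are placeholders following the example above):

import requests

url = "http://192.168.10.2:5000/dashboard/report.csv"   # hypothetical file served by XAMPP in the VM
response = requests.get(url)
response.raise_for_status()
with open("report.csv", "wb") as f:
    f.write(response.content)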
I need to get a complete list of files in a folder and all its subfolders regularly (daily or weekly) to check for changes. The folder is located on a server that I access as a network share.
This folder currently contains about 250,000 subfolders and will continue to grow in the future.
I do not have any access to the server other than the ability to mount the filesystem R/W.
The way I currently retrieve the list of files is by using python's os.walk() function recursively on the folder. This is limited by the latency of the internet connection and currently takes about 4.5h to complete.
A faster way to do this would be to create a file server-side containing the whole list of files, then transferring this file to my computer.
Is there a way to request such a recursive listing of the files from the client side?
A python solution would be perfect, but I am open to other solutions as well.
My script is currently run on Windows, but will probably move to a Linux server in the future; an OS-agnostic solution would be best.
You have provided the answer to your question:
I do not have any access to the server other than the ability to mount the filesystem R/W.
Nothing more needs to be said after that, since any server-side processing requires the ability to (directly or indirectly) launch a process on the server.
If you can collaborate with the server admins, you could ask them to periodically run a server-side script that builds a compressed archive (for example a zip file) containing the files you need and moves it to a specific location when done. Then you would only download that compressed archive, saving a lot of network bandwidth.
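As a rough sketch, such a server-side job could be just a few lines of Python (the paths here are made up and would be whatever you agree on with the admins):

import shutil

# Build folder.zip from the shared folder, then move it to an agreed pickup location.
archive = shutil.make_archive("/tmp/folder", "zip", "/data/shared_folder")
shutil.move(archive, "/data/pickup/folder.zip")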
You can approach this in multiple ways. I would do it by running a script over ssh, like:
ssh xys@server 'bash -s' < local_script_togetfilenames.sh
If you prefer Python, you can run a similar Python script by giving it a #!/usr/bin/env python shebang line, assuming Python is installed on the server.
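For example, a minimal listing script you could run that way might look like this (the mount point /data/shared_folder is a placeholder for wherever the folder lives on the server):

#!/usr/bin/env python
import os

# Print every file path under the folder; redirect stdout on your side to capture the listing.
for root, dirs, files in os.walk("/data/shared_folder"):
    for name in files:
        print(os.path.join(root, name))

invoked, for example, as: ssh xys@server python - < listfiles.py > filelist.txt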
If you want to stick to pure Python, you should explore Python RPC (Remote Procedure Call).
You can use the RPyC library; the documentation is here.
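For completeness, a minimal RPyC sketch could look like this (it assumes a classic RPyC server, e.g. rpyc_classic.py, is already running on the machine that hosts the folder; the host name and path are placeholders):

import rpyc

conn = rpyc.classic.connect("server")   # hypothetical host running rpyc_classic.py
remote_os = conn.modules.os             # the os module *on the server*
paths = []
for root, dirs, files in remote_os.walk("/data/shared_folder"):
    for name in files:
        paths.append(remote_os.path.join(root, name))
conn.close()

Note that every call on a remote object still crosses the network, so with 250,000 folders the ssh approach above will likely be much faster.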
Forgive me, I'm new to all this. It might not even be possible?
I have a Dash app that does a number of calculations, and I need to deploy it locally somehow.
I need all users in our company to be able to view it, but without them needing the package dependencies. I cannot use any web-based method (Heroku, Git, etc.) as the data is commercially sensitive and must remain on site.
I can successfully run it through waitress-serve on my machine and it can be viewed on other computers, but I'd rather it run from the server and be accessible to anyone who wants to use it.
What's the solution? Is it possible to have a folder on the server that holds all the associated files and dependencies, and then a batch file (or similar - that's what I use now to launch mine) to launch the app on a WSGI server? Or would every machine on our network have to have the Python dependencies installed?
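For reference, the way I run it now is roughly this (assuming the main file is app.py; my actual file names differ):

# app.py
import dash

app = dash.Dash(__name__)
server = app.server   # the underlying Flask app that waitress will serve
# ... layout and callbacks ...

launched from the batch file with: waitress-serve --host=0.0.0.0 --port=8050 app:server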
I was wondering if JupyterLab has an API that allows me to programmatically upload files from my local storage to the JupyterLab portal. Currently, I am able to manually select "Upload" through the UI, but I want to automate this.
I have searched their documentation but had no luck. Any help would be appreciated. Also, I am using a Chromebook (if that matters).
Thanks!!
Firstly, you can use the Python packages "requests" and "urllib" to upload files:
https://stackoverflow.com/a/41915132/11845699
This method is effectively the same as clicking the upload button, but the upload speed is not very satisfying, so I don't recommend it if you are uploading lots of files or large files.
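For reference, the approach in that link boils down to a PUT against the Jupyter contents API; a minimal sketch (the server URL, token, and file name are placeholders):

import base64
import json
import requests

server = "http://localhost:8888"   # hypothetical JupyterLab server
token = "YOUR_TOKEN"               # e.g. from `jupyter server list`

with open("data.csv", "rb") as f:
    body = json.dumps({
        "type": "file",
        "format": "base64",
        "content": base64.b64encode(f.read()).decode("ascii"),
    })

r = requests.put(
    server + "/api/contents/data.csv",
    headers={"Authorization": "token " + token},
    data=body,
)
r.raise_for_status()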
I don't know whether your JupyterLab server is managed by your administrator or by yourself. In my case, I'm the administrator of the server in my lab, so I set up an NFS disk and mounted it to a folder in the JupyterLab working directory. The users can access this NFS disk via our local network or the internet. An NFS disk can handle lots of large files, which is much more efficient than the Jupyter upload button. I learned this from a talk by a TA at Berkeley: https://bids.berkeley.edu/resources/videos/teaching-ipythonjupyter-notebooks-and-jupyterhub
I highly recommend this if you can contact the person who has access to the file system of your Jupyter server. If you don't use Linux, then WebDAV is an alternative to NFS. Actually, anything that can give you access to a folder on a remote server is an option, such as Nextcloud or Pydio.
(If you can't ask the administrator to deploy such a service, then just use the Python packages.)
I'm getting started with Docker. I installed Docker Toolbox on Windows 10 and downloaded the desired container. I need full access to the container's filesystem, with the ability to add and edit files. Can I transfer the contents of the container into a virtual Python environment in the Windows filesystem? How do I do it?
Transferring files between Windows and Linux might be a little annoying because of different line endings.
Putting that aside, it sounds like you are looking to create a Docker-based development environment. There are good tutorials online that walk you through setting one up; I would start with one of these:
Running a Rails Development Environment in Docker. This one is about Rails, but the principles will be the same. Section 3 specifically talks about sharing code between your host machine and the Docker container.
How To Work with Docker Data Volumes on Ubuntu 14.04 includes a brief introduction to Docker containers, different use cases for data volumes, and how to get each one working. The "Sharing Data Between the Host and the Docker Container" section covers what you are trying to do. That example talks about reading log files created inside the container, but the principle is the same for adding or updating files in the container.
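The short version of host/container sharing is a bind-mounted volume, e.g. (the paths and image name are placeholders; note that with Docker Toolbox only folders under C:\Users are shared with its VM by default):

docker run -it -v //c/Users/you/project:/app your-image bash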
I have designed a small, simple, static website offline in Odoo 8. It is on my computer (localhost).
I want to transfer these files to a domain server, which runs Windows with a MySQL database.
Which files do I have to pick up from my PC to transfer to the web server?
What is the path for these files? (Right now I can see files in C:\Program Files (x86)\Odoo 8.0-20150526, but I am not able to find the website module files, e.g. images or index.)
Can you walk me through which files to pick up and how to upload them through an FTP client like FileZilla?
Will a Python site work on Windows Server?
It's pretty simple: you don't need to copy any Python files; that's not the right way in Odoo. The right way is as follows.
First, deploy the same version of Odoo from the nightly builds to your domain, then take a backup of your local instance with the "Zip (includes filestore)" option, and then restore that copy on your domain instance.
Guide to How to Backup and Restore
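If you later want to script the backup step instead of clicking through the database manager page, recent Odoo versions expose it over HTTP; a hedged sketch (the URL, master password, and database name are placeholders, and Odoo 8's manager used different field names):

import requests

r = requests.post(
    "http://localhost:8069/web/database/backup",
    data={"master_pwd": "admin", "name": "mydb", "backup_format": "zip"},
    stream=True,
)
r.raise_for_status()
with open("mydb.zip", "wb") as f:
    for chunk in r.iter_content(chunk_size=65536):
        f.write(chunk)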
Bests