How to set up full access to shared folders in Linux? - python

Hello everyone.
I run Python scripts inside a Linux virtual machine. There are a few folders shared between the host and guest systems. The script works with files in a folder mounted into the guest file system through the VirtualBox Guest Additions. I have changed the access mode of all files and directories in this folder. Other programs (e.g. MATLAB) have full access to the shared file system (they can create or delete any file) when I run them as the superuser. Python, however, returns this error when I run shutil.rmtree(path):
OSError: [Errno 26] Text file busy
How can I share my folders without running into problems like this?
Details:
Guest - Linux Ubuntu 18.04
Host - Windows 10
VirtualBox version 6.1.6 r137129
Python 3.6

I get this problem as well with shared directories. To my knowledge, there is only one solution: don't work with executable files in shared directories.
My understanding of the problem is that the guest operating system is trying to run your command on that file or directory while the host operating system is doing something with it via VirtualBox. I don't know the specifics of what exactly VirtualBox is doing in this context, but I suspect it has something to do with synchronizing the contents of the files back to the host.
Probably not the answer you were hoping for, but virtual machines are meant to be entirely self-contained, so using shared directories should probably be avoided. If your code is version controlled using something like Git, try cloning the repository into the virtual machine instead.
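If you can't avoid shared folders entirely, one workaround consistent with the advice above (a sketch, not verified against VirtualBox itself) is to copy the shared folder into a guest-local temporary directory and do all destructive operations there, where ETXTBSY cannot occur. The function name and prefix below are my own:

```python
import shutil
import tempfile
from pathlib import Path

def work_on_local_copy(shared_dir: str) -> str:
    """Copy a shared folder into a guest-local temp directory.

    Deleting or replacing executables inside a VirtualBox shared
    folder can fail with OSError: [Errno 26] Text file busy, so
    perform destructive operations on a local copy instead.
    """
    local_dir = Path(tempfile.mkdtemp(prefix="vbox-copy-")) / "work"
    shutil.copytree(shared_dir, local_dir)
    return str(local_dir)
```

Afterwards, shutil.rmtree on the local copy works as expected, because the host is not holding any of those files open.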

Related

Airflow doesn't see my local file: 'FileNotFoundError: [Errno 2] No such file or directory' [duplicate]

I'm trying to create a container to run a program. I'm using a preconfigured image, and now I need to run the program. However, it's a machine learning program and it needs a dataset from my computer to run.
The file is too large to be copied into the container. It would be best if the program running in the container read the dataset from a local directory on my computer, but I don't know how I can do this.
Is there any way to set up this reference with some docker command? Or using a Dockerfile?
Yes, you can do this. What you are describing is a bind mount. See https://docs.docker.com/storage/bind-mounts/ for documentation on the subject.
For example, if I want to mount a folder from my home directory into /mnt/mydata in a container, I can do:
docker run -v /Users/andy/mydata:/mnt/mydata myimage
Now, /mnt/mydata inside the container will have access to /Users/andy/mydata on my host.
Keep in mind, if you are using Docker for Mac or Docker for Windows there are specific directories on the host that are allowed by default:
If you are using Docker Machine on Mac or Windows, your Docker Engine daemon has only limited access to your macOS or Windows filesystem. Docker Machine tries to auto-share your /Users (macOS) or C:\Users (Windows) directory, so you can mount files or directories from those locations.
Update July 2019:
I've updated the documentation link and naming to be correct. This type of mount is called a "bind mount". The snippet about Docker for Mac or Windows no longer appears in the documentation, but it should still apply. I'm not sure why they removed it (my Docker for Mac still has an explicit list of host paths that are allowed to be mounted).
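The host:container order in the -v flag is easy to get backwards. As a quick illustration (the helper function is my own, not part of Docker or its SDK), this is how the flag is assembled:

```python
def bind_mount_flag(host_path: str, container_path: str,
                    read_only: bool = False) -> list:
    """Build the -v argument for `docker run`: the host path comes
    first, then the mount point inside the container, with an
    optional :ro suffix for a read-only bind mount."""
    spec = f"{host_path}:{container_path}"
    if read_only:
        spec += ":ro"
    return ["-v", spec]

# Reproduces the example above:
#   docker run -v /Users/andy/mydata:/mnt/mydata myimage
args = bind_mount_flag("/Users/andy/mydata", "/mnt/mydata")
```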

Visual Studio Code use on remote files

Is there any way to use the Python extension to edit files that reside on a remote server? I have tried NFS and remoteFS, but I do not see any way to get IntelliSense working with the remote installation. I normally edit and test on a Windows machine, while the target runs on Linux.
I realise this is not limited to this extension, but is a more general issue.
Visual Studio Code now officially supports this using an Extension: Remote SSH
Read the release notes here: https://code.visualstudio.com/blogs/2019/05/02/remote-development
Today we're excited to announce the preview of three new extensions for Visual Studio Code that enable seamless development in Containers, remotely on physical or virtual machines, and with the Windows Subsystem for Linux (WSL). You can get started right away by installing the Remote Development Extension Pack.
As a workaround, I'm using a Linux-hosted virtual machine which has a similar setup to the target. This works surprisingly well. It is a shame VMware 12 removed support for Unity.
I use SSHFS (wikipedia) (github repo)
sshfs OWN_USER@SERVER:/PATH_TO_FILES/ MOUNT_POINT
This makes the remote files visible to any program on your computer, as-if they were local files, through a virtual "FUSE" filesystem.
If your own user can't access the files (you need root or some other user), you can sudo like so:
sshfs -o sftp_server="sudo -u SYSTEM_USER /usr/libexec/openssh/sftp-server" \
OWN_USER@SERVER:/PATH_TO_FILES/ MOUNT_POINT
You can install sshfs for Linux, Mac, or Windows, check out Digital Ocean's guide in my first link.
Don't forget to umount, fusermount -u, or eject that MOUNT_POINT once you're done.
I don't know if VS Code features like IntelliSense would work with this. They should, because sshfs makes the remote files visible just like any others. But it would require that the Python tool chain on your laptop be the same as on the server. It'd be interesting to find out.
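Before pointing an editor or tool chain at the mount point, it can help to verify that the SSHFS mount is actually up (a stale or unmounted point just looks like an empty directory). A minimal standard-library sketch:

```python
import os

def is_mounted(mount_point: str) -> bool:
    """True if mount_point is the root of a mounted filesystem
    (e.g. an active sshfs/FUSE mount), False for a plain directory."""
    return os.path.ismount(mount_point)
```

For example, is_mounted("/") is always True on Linux, while a freshly created temporary directory returns False.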
Or, Microsoft just announced some new plugins on the way
Yes, there are some. I used this one. It allows you to synchronize code between the local and remote server.
You will have to keep a copy on the local host, and it can be configured to automatically update the code on the remote.
https://gurumantra.themillennialpost.info/2020/05/edit-linux-files-remotely-in-vscode.html
Download and install VS Code on your local PC if you don't have it.
Summary :
Install VS Code and the Remote VSCode extension – local PC
Install SSH and rmate – remote PC
Ready to access files/data
Detailed steps: see the link above.

Remote Python Environment for Sublime

For various purposes one might need to hook up a remote python interpreter.
How can I access anaconda/bin from an sshfs-mounted folder? All my attempts yielded "access denied", since the interpreter and the Python file are on different machines.
There is a known solution of hosting a server on the remote machine to access the remote conda install, but this is not an option for me.
Alternatively, one could install the same environment locally; how would one keep the two in sync?
The main idea is to get Anaconda's Sublime functionality on the local machine. Running/managing a Python project can easily be done through SSH.
Edit: I am aware of the various remote/on_server editing options, (vim, emacs ...) but we have to stick to sublime.
Note: I do not have sudo privileges on this server.
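On the sub-question of keeping a local copy of the environment in sync with the remote one: a low-tech approach is to capture `pip freeze` (or `conda list --export`) on both machines and diff the results. A sketch in pure string handling (function names are my own; no network access is assumed):

```python
def parse_freeze(text: str) -> dict:
    """Parse `pip freeze`-style lines ("name==version") into a dict."""
    pkgs = {}
    for line in text.splitlines():
        line = line.strip()
        if "==" in line and not line.startswith("#"):
            name, version = line.split("==", 1)
            pkgs[name.lower()] = version
    return pkgs

def env_diff(local: str, remote: str) -> dict:
    """Return packages whose versions differ between the two
    environments, or that exist on only one side (None marks the
    missing side)."""
    a, b = parse_freeze(local), parse_freeze(remote)
    return {name: (a.get(name), b.get(name))
            for name in sorted(set(a) | set(b))
            if a.get(name) != b.get(name)}
```

Running env_diff over the two freeze outputs gives a concrete list of packages to install or pin before the local environment matches the server.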

How to install django applications on a Memory Stick

I am currently developing open source software based on Python/Django. The software should later be easily installable by standard Windows/Linux users without any programming experience. It should also be portable between computers. The only installation that should be required on these computers is Python itself.
Is there a way to get this to work?
I already found "dbuilder" in this question: Django Projects as Desktop applications: how to?
It seems to be a bit outdated and not a very smooth solution.
Are there better solutions?
Just use a portable version of Python on your memory stick. Make a batch file, projname.bat, that runs:
python.exe /django-app-path/manage.py runserver
Now open a browser; the default address will be:
http://127.0.0.1:8000
If you need to browse your app from a device other than the one the app is running on, get your server IP with:
windows shell>ipconfig
linux shell# ifconfig
then run your development server on that address (in the batch file):
python.exe /django-app-path/manage.py runserver your-ip-address:port-if-not-80
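A small launcher script can replace the batch file entirely, which also makes the stick work the same on Windows and Linux. The sketch below (names and the default address are illustrative) uses sys.executable so the server always starts under whatever portable interpreter is running the launcher:

```python
import sys

def runserver_command(manage_py: str,
                      address: str = "127.0.0.1:8000") -> list:
    """Assemble the Django dev-server command using the interpreter
    on the stick (sys.executable), so the launcher works unmodified
    on any machine the stick is plugged into."""
    return [sys.executable, manage_py, "runserver", address]

# Then launch it with, e.g.:
#   import subprocess
#   subprocess.call(runserver_command("/django-app-path/manage.py"))
```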

How to watch a nfs-mounted directory for newly created or changed files under Linux with Python?

There is a directory on a server that I have no access to except via NFS mounts. I mount the directory via NFS into a local Linux system. New files arrive in the directory, and some older files may get updated by other processes on the server.
I'd like to write a Python script that kicks into action whenever such a file is created or changed. I know that it is possible to watch a local directory under Linux with Python using inotify (or dnotify in older versions). However, these do not seem to work for remotely mounted volumes.
What are my options or is there a solution already implemented?
You could try FAM.
FAM can provide an RPC service for monitoring remote files (like a mounted NFS file system).
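Since inotify events are generated by the local kernel, they never fire for changes made on the server side of an NFS mount; FAM works around this with an RPC service, and the fallback is plain polling of modification times. A minimal polling sketch (the interval and flat, non-recursive scan are assumptions for illustration):

```python
import os
import time

def snapshot(directory: str) -> dict:
    """Map each regular file's name to its modification time."""
    return {name: os.stat(os.path.join(directory, name)).st_mtime
            for name in os.listdir(directory)
            if os.path.isfile(os.path.join(directory, name))}

def changed_files(before: dict, after: dict) -> set:
    """Names that are new, or whose mtime moved, between snapshots."""
    return {name for name in after
            if name not in before or after[name] != before[name]}

def watch(directory: str, interval: float = 5.0):
    """Poll the directory forever, yielding each batch of changes."""
    before = snapshot(directory)
    while True:
        time.sleep(interval)
        after = snapshot(directory)
        delta = changed_files(before, after)
        if delta:
            yield delta
        before = after
```

Polling over NFS has a real cost per scan, so the interval should be as long as the application can tolerate; FAM avoids that cost by having the server push notifications instead.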
