I want to work on some privacy / intellectual property sensitive Python code in PyCharm on a Windows machine.
I want to prevent anybody else from later seeing my code on the local file system (or recovering / undeleting it from the SSD-based file system). Therefore I am looking for a solution that keeps my Python script(s) encrypted on the local file system but editable / usable within PyCharm.
I thought of creating a RAM disk or installing a virtual machine on the Windows computer. Unfortunately, I do not have admin rights on that computer, so I cannot install any software or create virtual machines.
Additionally, I can use a USB stick in read-only mode on that machine, but I cannot write files back to that USB drive.
I am looking for a solution where I am still able to edit a Python script on that computer within PyCharm, but the file should not be persisted to the file system. Once I have finished my work, the Python script should be written back to the file system in an encrypted way.
Use an online IDE and save your code online.
If you really want PyCharm, I would upload the code somewhere, delete it from the PC, and download it again whenever I needed it.
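If you also want the copy that sits on disk to be unreadable, a minimal sketch using the cryptography package could look like the following (a per-user pip install usually works without admin rights; the file names and key handling here are placeholder assumptions, not a vetted scheme):

    # pip install --user cryptography   (per-user install, no admin rights required)
    from pathlib import Path
    from cryptography.fernet import Fernet

    # Generate the key once and keep it off this machine (e.g. on the
    # read-only USB stick); the same key is needed later to decrypt.
    key = Fernet.generate_key()
    fernet = Fernet(key)

    # Before leaving: encrypt the script and remove the plaintext copy
    plain = Path("my_script.py").read_bytes()
    Path("my_script.py.enc").write_bytes(fernet.encrypt(plain))
    Path("my_script.py").unlink()

    # Next session: decrypt it back for editing in PyCharm
    cipher = Path("my_script.py.enc").read_bytes()
    Path("my_script.py").write_bytes(fernet.decrypt(cipher))

The caveat: the plaintext still touches the SSD while you edit, and the deleted copy may remain recoverable, which is exactly the undelete concern in the question, so this narrows the exposure window rather than closing it.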
You could work 'in the cloud': have an AWS instance where you check out / clone / develop your work, and use your computer essentially as a thin client for connecting to the remote instance.
I am using a locked-down system where I cannot install any applications, including Anaconda or any other Python distribution.
Does anybody know if it is possible to access local files from an online Jupyter solution? I know it would probably be slow, as the files would have to be moved back and forth.
Thanks
Yes, you can use your local files from an online Jupyter solution by moving them back and forth as you describe. (The remote server cannot connect to your local system itself beyond the browser sandbox, so concerns like the ones Chris mentions aren't an issue.)
I can demonstrate this easily:
Go here and click on the launch binder badge you see.
A temporary session backed by MyBinder.org will spin up. Depending on where you are in the world, you may land on a machine run by the Jupyter folks via Google, or on one run by another member of a federation of organizations that believe this is a valuable service to offer to empower Jupyter users.
After the session comes up, you'll be in the JupyterLab interface. You'll see a file/directory navigation pane on the left side. You can drag a file from your local computer and drop it in that pane. You should see it show up in the remote directory.
You should be able to open and edit it, depending on what it is or what you convert it to. You can even run it.
Of course you can make a new notebook in the remote session and save it. Then, after saving it, download it back to your local machine by right-clicking its icon in the file navigation pane and selecting 'Download'.
If you prefer to work in the classic Jupyter notebook interface, you can go to 'Help' and select 'Launch Classic Notebook' from the menu. The classic Jupyter Dashboard will come up. You will need to upload things there using the upload button, as drag and drop only works in JupyterLab. You can download back to your local computer from the dashboard, or, when you have a notebook open, use the file menu to download back to your local machine, too.
Make sure you save anything useful back to your machine, as the sessions are temporary and will time out after 10 minutes of inactivity. They'll also disconnect after a few hours even if you are actively using them. There's a safety net built in that works if a session does disconnect, but you have to be aware of it ahead of time, and it is best tested a few times in advance, when you don't need it. See Getting your notebook after your Binder has stopped.
As this is going to a remote machine, obviously there are security concerns. Part of this is addressed by the temporary nature of the sessions: nothing is stored remotely once the active session goes away. (Hence the paragraph above, because once it is gone, it is gone.) However, don't upload anything you wouldn't want someone else to see, and don't share keys and the like with this system. In fact, it is now possible to do real-time co-authoring/co-editing of Jupyter notebooks via the MyBinder system, although some of the minor glitches are still being worked out.
A lot of packages can be installed right in the session using %pip install or %conda install in notebook cells. However, sometimes you want them already installed so the session is ready with the necessary software. (Plus, some software won't work unless it is installed while the image backing the session's container is built.) That is where it becomes handy that you can customize the session that comes up via configuration files in public repositories. A list of places where you can host those files can be seen by going to MyBinder.org and opening the dropdown menu in the top left of the form there, under 'GitHub repository name or URL'. Here's an example. You can look in requirements.txt and see that I install quite a few packages from the data science stack.
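As an illustration, installing a package for just the current session is a single notebook cell (pandas here is only an example; it may already be pre-installed on some images):

    # Run in a notebook cell; installs into the running session only
    %pip install pandas

    import pandas as pd
    print(pd.__version__)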
Of course, there are other related online offerings for Jupyter (or you can install it on remote servers), and many use authentication. As some of those cost money and you are unsure about your locked-down system, the MyBinder.org system may help you test the limits of what you can do on your machine.
With this question I would like to gain some insights/verify that I'm on the right track with my thinking.
The request is as follows: I would like to create a database on a server. This database should be updated periodically by adding information that is present in a certain folder on a different computer. Both the server and the computer will be within the same network (I may be running into some firewall issues).
So the method I am thinking of using is as follows. Create a tunnel between the two systems. I will run a script that periodically (hourly or daily) searches through the specified directory, converts the files to data and adds it to the database. I am planning to use Python, which I am fairly familiar with.
Note: I don't think I will be able to install Python on the PC with the files.
Is this at all doable? Is my approach solid? Please let me know if additional information is required.
Create a tunnel between the two systems.
If you mean setting up the firewall between the two machines to allow the connection, then yeah. Just open the PostgreSQL port. Check postgresql.conf for the port number in case it isn't the default. Also put the correct permissions in pg_hba.conf so the computer's IP can connect to it.
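As a sketch, the pg_hba.conf entry could look like this (the database name, user, and address are placeholders for your setup):

    # pg_hba.conf: allow the file computer's IP to connect to the database
    # TYPE  DATABASE  USER    ADDRESS            METHOD
    host    mydb      loader  192.168.1.60/32    md5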
I will run a script that periodically (hourly or daily) searches through the specified directory, converts the files to data and adds it to the database. I am planning to use Python, which I am fairly familiar with.
Yeah, that's pretty standard. No problem.
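A minimal sketch of such a job, assuming PostgreSQL with psycopg2 and CSV input files (the paths, credentials, and table schema are all placeholders):

    import csv
    from pathlib import Path
    import psycopg2  # pip install psycopg2-binary

    WATCH_DIR = Path(r"\\filepc\share\incoming")  # placeholder network path

    def load_new_files(conn):
        # Pick up every unprocessed CSV, insert its rows, then mark it done
        for path in sorted(WATCH_DIR.glob("*.csv")):
            with path.open(newline="") as fh:
                reader = csv.reader(fh)
                next(reader)  # skip the header row (drop this if there is none)
                rows = list(reader)
            with conn, conn.cursor() as cur:  # one transaction per file
                cur.executemany(
                    "INSERT INTO measurements (col_a, col_b) VALUES (%s, %s)",
                    rows,
                )
            path.rename(path.with_suffix(".done"))  # don't process it twice

    if __name__ == "__main__":
        conn = psycopg2.connect(host="192.168.1.50", port=5432, dbname="mydb",
                                user="loader", password="secret")
        load_new_files(conn)

Scheduling it hourly or daily is then just a Windows Task Scheduler or cron entry on whichever machine runs the script.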
Note: I don't think I will be able to install Python on the PC with the files.
On Windows you can install Anaconda for all users or just for the current user. The latter doesn't require admin privileges, so that may help.
If you can't install Python, then you can use tools that turn your Python program into an executable containing all the libraries, so you just have to drop that into a folder on the computer and execute it.
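PyInstaller is one such tool; for example, built on a machine where you can install Python and then copied over (the script name is a placeholder):

    pip install pyinstaller
    pyinstaller --onefile my_loader.py
    # result: dist/my_loader.exe with Python and the libraries bundled in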
If you absolutely cannot install anything or execute any program, then you'll have to create a scheduled task that copies the data over the network to a computer that has Python, and run the Python script there, but that's extra complication.
If the source computer is automatically backed up to a server, you can also use the backup as a data source, but there will be a delay depending on how often it runs.
I have been working with PyCharm for quite some time now, and I recently upgraded my storage setup with a NAS.
Everything is working fine except one thing: PyCharm scans through my files to reindex them very, very often. This makes me lose a lot of time waiting for it to finish.
The reindexing occurs:
When a script ends
When a debugging session ends
When PyCharm loses the focus, i.e. I use another application
So it happens basically ALL the time, taking quite a long time (several minutes sometimes).
Misc.:
Windows 10
PyCharm Community Edition 2018.1
Netgear - ReadyNas 422
Do you have any ideas on how to solve this issue?
So I have contacted the IntelliJ support and here is their response:
Working with network drives/folders is not supported officially yet. Using remote development features is recommended (remote interpreter, deployment etc). Here is a more detailed answer: https://intellij-support.jetbrains.com/hc/en-us/community/posts/207069145/comments/207464249.
What I ended up doing, which is really not ideal, is to create a local copy of my project environment and sync it with a folder on my NAS. To do so I used the SyncBackPro software.
I'm using PyCharm both at home and at work with code stored on a Samba share (using its remote interpreter feature). I don't encounter constant reindexing, but by default such a setup does not support file system notifications, so PyCharm cannot tell when a file has changed.
However, as a programmer this shouldn't discourage you! You can drop in your own file system notifier that connects to your remote system (assuming your NAS runs Linux and supports SSH) and thus avoid the performance drop.
I actually wrote such a proxy to run the fsnotifier on a remote system a few years ago and I'm still using it. If you are interested, check out https://github.com/ThiefMaster/fsnotifier-remote
Some things in the repo are outdated (JetBrains removed this stupid file size check for example), but it should still provide you a good basis to start from if you are interested in using it.
I need to read a CSV file, which is saved on my local computer, from code within an "Execute R/Python Script" module in an Azure Machine Learning Studio experiment. I must not upload the data the usual way, i.e. from Datasets -> New -> Load from local file or with an Import Data module; I must do it with code. In principle this is not possible, neither from an experiment nor from a notebook, and in fact I always got an error. But I'm confused because the documentation about the Execute Python Script module says (among other things):
Limitations
The Execute Python Script currently has the following limitations:
Sandboxed execution. The Python runtime is currently sandboxed and, as a result, does not allow access to the network or to the local file system in a persistent manner. All files saved locally are isolated and deleted once the module finishes. The Python code cannot access most directories on the machine it runs on, the exception being the current directory and its subdirectories.
According to the highlighted text, it should be possible to access and load a file from the current directory, using for instance the pandas function read_csv. But actually no. Is there some trick to accomplish this?
Thanks.
You need to remember that Azure ML Studio is an online tool, and it's not running any code on your local machine.
All the work is being done in the cloud, including running the Execute Python Script module, and this is what the text you've highlighted refers to: the directories and subdirectories of the cloud machine running your machine learning experiment, not your own local computer.
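For reference, a sketch of what the module actually runs in classic ML Studio: data has to arrive through the module's input ports or the zipped Script Bundle port, not from your local disk (the file name below is a placeholder):

    import pandas as pd

    # Entry point that classic Azure ML Studio invokes for this module
    def azureml_main(dataframe1=None, dataframe2=None):
        # dataframe1/dataframe2 are the datasets wired to the input ports.
        # Files zipped into the third (Script Bundle) port are extracted to
        # ./Script Bundle in the current directory, e.g.:
        # extra = pd.read_csv("./Script Bundle/my_data.csv")
        return dataframe1,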
Is it possible to open files from a Raspberry Pi in Windows for editing (using, for example, Notepad++)?
I am currently using the built-in Python IDE in Raspbian, but I feel it would speed up the development process if I could use a Windows IDE. I have also tried using a git repo to share files between the Pi and Windows, but it is a bit cumbersome.
Or does anyone have any other ideas about a workflow between Windows and the Raspberry Pi?
You can run a Samba server on your Raspberry Pi and expose your Python project folder as a network disk. Then you can use any Windows IDE you like; just open the files from the network disk.
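A minimal share definition on the Pi might look like this (the share name and path are just examples; restart the Samba service after editing):

    # /etc/samba/smb.conf -- append a share for the project folder
    [pyproject]
        path = /home/pi/project
        read only = no
        browseable = yes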
Currently I am using VS2015 + Python Tools for Visual Studio for remote debugging purposes.
Sure. I have tried many approaches, and I found that one of the best ways is using WinSCP.
It makes it very easy to edit and update files with Notepad++ right from Windows.
Why not just set up a VM running Raspbian on your Windows machine? Something like this will get you started: http://www.makeuseof.com/tag/emulate-raspberry-pi-pc/
Otherwise, set up a network share between the two, edit files on your Windows computer, and run from the Pi.
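If you go the network-share route, running the code on the Pi from Windows can then be a single command, assuming SSH is enabled on the Pi (the host name and path are placeholders):

    ssh pi@raspberrypi.local python3 /home/pi/project/main.py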