Save Excel file to local storage from Jupyter using Python

I am working with pandas, and I have plenty of Excel files that I need to save to local storage directly from a Jupyter Notebook.
For example, I create an Excel file named "test.xlsx".
Now I want to save this Excel file somewhere on my C:/ drive.
Please help me with the Python code for this.
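A minimal sketch using pandas' to_excel, which needs the openpyxl package installed. On Windows you could pass an absolute destination such as Path("C:/data/test.xlsx") (the folder must already exist); here the file is written next to the notebook so the sketch runs anywhere, and the DataFrame contents are just example data:

```python
from pathlib import Path
import pandas as pd

# Example DataFrame standing in for your data
df = pd.DataFrame({"name": ["alice", "bob"], "score": [90, 85]})

# On Windows, swap this for an absolute path like Path("C:/data/test.xlsx").
out_path = Path("test.xlsx")
df.to_excel(out_path, index=False)  # index=False omits the row-index column
print(out_path.resolve())  # full path of the saved file
```

Forward slashes work fine in Windows paths from Python, so "C:/data/test.xlsx" does not need backslashes.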

Related

Download Colab notebook from itself

Is there a way to download the Google Colab notebook from one of its cells?
The issue arises when several people work on the same notebook and one accidentally overwrites another's output. I want to save the .ipynb file automatically after running my cells.
I'm aware of the way to download a file that was created by the notebook:
from google.colab import files
files.download('example_file.csv')
But I couldn't find where the notebook's .ipynb file is stored in the filesystem outside the content directory.

pyexcel: Save the exported xls file in the Downloads folder

I'm new to pyexcel. I'm programming in Django and Python, and I export an array with the following code:
pyexcel.isave_as(array=array_of_data, dest_file_name=('%(name)s.xls') % {'name': model.name})
My problem is that the file is exported and saved, but I don't want to save it in the program folder; I want the .xls file to end up in the Downloads folder.
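One option, assuming the export runs on your own machine rather than on a remote server, is to build the destination path so it points at your user's Downloads folder and pass that string as dest_file_name. A minimal stdlib sketch of the path handling (the report.xls name is just an example):

```python
from pathlib import Path

# Locate the current user's Downloads folder (works with the default folder
# layout on Windows, macOS, and Linux; adjust if yours is elsewhere).
downloads = Path.home() / "Downloads"

# Build the full destination path; str(dest) can then be passed to pyexcel,
# e.g. pyexcel.isave_as(array=array_of_data, dest_file_name=str(dest))
dest = downloads / "report.xls"
print(dest)
```

Note that if the Django app runs on a remote server, saving to a server-side Downloads folder won't reach the user; in that case the usual approach is to return the file in an HttpResponse with a Content-Disposition: attachment header so the browser downloads it.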

How can I read a gdoc file in Google Colab?

I'm trying to read a bunch of Google Docs files into Google Colab to work with some text data.
It can't seem to read the '.gdoc' file format, only the .txt format.
Do I have to save them all as .txt files first? Is there an efficient way to do this in Python? Or is it possible to work with .gdoc files directly?
Thanks for any help!
Hi, I was stuck on the same problem, and the following worked for me:
Go to the Drive folder where all the gdocs are present.
Right-click on it and download the whole folder.
Google Drive automatically converts all gdocs to .docx during that operation.
Upload them to Colab or use them locally.
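Once the files are .docx, their text can be pulled out even without extra packages, since a .docx is a zip archive whose body text lives in word/document.xml. A rough stdlib sketch (the python-docx package, pip-installable in Colab, is the more robust option):

```python
import re
import zipfile

def docx_to_text(path):
    """Extract the plain text from a .docx file using only the stdlib.

    A .docx is a zip archive; the visible text sits inside <w:t> elements
    of word/document.xml. This naive regex approach ignores formatting and
    tables, which is often enough for raw text analysis.
    """
    with zipfile.ZipFile(path) as z:
        xml = z.read("word/document.xml").decode("utf-8")
    # Grab the contents of every <w:t> run and join them.
    return "".join(re.findall(r"<w:t[^>]*>([^<]*)</w:t>", xml))
```

For example, docx_to_text("my_doc.docx") returns the document's text as one string.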

Read CSV from another folder in Jupyter Notebook (python)

I have my CSV files in a Data folder in Jupyter Notebook, and a few other notebooks that need to rely on those CSV files. How do I read the CSV files from the Data folder into the notebooks where the analysis will occur? I'm using a Mac and doing this in Python.
You need to pass the path of the CSV file to pd.read_csv. For example, if your Jupyter notebook is on your Desktop and the CSV file (let's call it "my_project_1") is somewhere else, say in a folder called Projects inside your Documents folder, then you specify the path as follows:
import pandas as pd
df = pd.read_csv('/Users/your_user_name/Documents/Projects/my_project_1.csv')
Don't forget to change your_user_name to your actual user name.
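If the Data folder instead sits next to the notebook, a relative path avoids hard-coding the user name. A small sketch that builds an example layout and then reads it back (the folder and file names are just illustrations):

```python
from pathlib import Path
import pandas as pd

# Set up a tiny example layout: a Data folder next to the notebook.
Path("Data").mkdir(exist_ok=True)
Path("Data/my_data.csv").write_text("a,b\n1,2\n3,4\n")

# "Data/my_data.csv" is resolved relative to the notebook's working
# directory, so no absolute /Users/... path is needed and the notebook
# keeps working when the project folder moves.
df = pd.read_csv(Path("Data") / "my_data.csv")
print(df)
```

This works as long as the notebooks and the Data folder are moved around together.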

How to save a dataframe as csv to a GitHub repository in Python

I have a dataframe named "domains". I want to save it as csv to my github project. How do I do that?
Many thanks!
You can use pandas to save the DataFrame as a CSV file in the local folder that your GitHub repo is cloned into:
domains.to_csv("path_to_local_git_folder/domains.csv")
More info about this function is on the pandas website.
Then, once you have your CSV file locally, you can add, commit, and push it to GitHub just like you would a Python script.
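The add/commit steps can also be driven from the notebook itself with subprocess; a rough sketch, assuming git is installed. It uses a throwaway repo so it runs anywhere; with a real project you would point repo at your existing clone (the path_to_local_git_folder from above), and the push line additionally needs a configured remote and credentials:

```python
import subprocess
import tempfile
from pathlib import Path

# Throwaway repo so the sketch is self-contained; in practice, point this
# at your existing local clone instead.
repo = Path(tempfile.mkdtemp())
subprocess.run(["git", "init"], cwd=repo, check=True)
subprocess.run(["git", "config", "user.email", "you@example.com"], cwd=repo, check=True)
subprocess.run(["git", "config", "user.name", "you"], cwd=repo, check=True)

# Save the CSV into the repo folder, then stage and commit it.
(repo / "domains.csv").write_text("domain\nexample.com\n")
subprocess.run(["git", "add", "domains.csv"], cwd=repo, check=True)
subprocess.run(["git", "commit", "-m", "Add domains.csv"], cwd=repo, check=True)

# Pushing requires a configured remote and credentials, so it is commented out:
# subprocess.run(["git", "push", "origin", "main"], cwd=repo, check=True)
```

With a real DataFrame you would replace the write_text line with domains.to_csv(repo / "domains.csv").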
