I want to run the following code. My goal is to produce a .csv file from a NetCDF file in which I have isolated variables. I also copied the output of print(my_file) below so you can see the structure of my NetCDF file. The error occurs when I run pd.Series, as shown below:
temp_ts = pd.Series(temp, index=dtime)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/gpfs/apps/miniconda3/lib/python3.7/site-packages/pandas/core/series.py", line 305, in __init__
data = sanitize_array(data, index, dtype, copy, raise_cast_failure=True)
File "/gpfs/apps/miniconda3/lib/python3.7/site-packages/pandas/core/construction.py", line 482, in sanitize_array
raise Exception("Data must be 1-dimensional")
Exception: Data must be 1-dimensional
Below is all of my code:
#netcdf to .csv
import netCDF4
import pandas as pd
import numpy as np
temp_nc_file = '/gpfs/home/UDCPP/barrier_c/Test_NCO/temp_Corse_10m_201704.nc'
nc = netCDF4.Dataset(temp_nc_file, mode='r')
nc.variables.keys()
lat = nc.variables['latitude'][:]
lon = nc.variables['longitude'][:]
time_var = nc.variables['time']
dtime = netCDF4.num2date(time_var[:],time_var.units)
temp = nc.variables['TEMP'][:]
# a pandas.Series designed for time series of a 2D lat,lon grid
temp_ts = pd.Series(temp, index=dtime)
temp_ts.to_csv('temp.csv',index=True, header=True)
And finally, the print(my_file) result, in case it is useful:
print(nc)
<class 'netCDF4._netCDF4.Dataset'>
root group (NETCDF4_CLASSIC data model, file format HDF5):
limi: 0
lima: 1100
pasi: 1
ljmi: 0
ljma: 462
pasj: 1
lkmi: 1
lkma: 60
pask: 1
global_imin: 0
global_imax: 1100
global_jmin: 0
global_jmax: 462
data_type: OCO oriented grid
format_version: 1.3.1
Conventions: CF-1.6 OCO-1.3.1 COMODO-1.0
netcdf_version: 4.1.2
product_version: 1.0
references: http://www.previmer.org/
easting: longitude
northing: latitude
grid_projection: n/a
distribution_statement: Data restrictions: for registered users only
institution: IFREMER
institution_references: http://www.ifremer.fr/
data_centre: IFREMER OCO DATA CENTER
data_centre_references: http://www.previmer.org/
title: PREVIMER MENOR 1200 forecast
creation_date: 2017-01-03T20:21:27Z
run_time: 2017-01-03T20:21:27Z
history: Tue Jun 9 18:24:38 2020: nces -O -d level,10,10 temp_Corse_201704.nc temp_Corse_10m_201704.nc
Tue Jun 9 15:41:33 2020: nces -O -d ni,565,700 -d nj,150,350 temp_201704.nc temp_Corse_201704.nc
Tue Jun 9 15:32:46 2020: ncks -O -v TEMP champs_meno_BE201704.nc temp_201704.nc
Mon Nov 6 10:27:14 2017: /appli/nco/nco-4.6.4__gcc-6.3.0/bin/ncrcat [list of ~240 3-hourly MARC_F2-MARS3D-MENOR1200 source files truncated] /home1/datawork//scoudray/DATARESU/MENOR/2017BE/champs_meno_BE201701.nc
2017-01-03T20:21:27Z: creation
model_name: MARS
arakawa_grid_type: C1
source: MARS3D V10.10
area: North Western Mediterranean Sea
southernmost_latitude: 39.5000
northernmost_latitude: 44.5000
latitude_resolution: 1.082250000000000E-002
westernmost_longitude: 0.0000
easternmost_longitude: 15.9999
longitude_resolution: 1.454540000000000E-002
minimum_depth: 5.000000
maximum_depth: 3500.000
depth_resolution: n/a
forecast_range: 4-days forecast
forecast_type: forecast
operational_status: experimental
NCO: netCDF Operators version 4.9.3 (Homepage = http://nco.sf.net, Code = http://github.com/nco/nco)
start_date: 2017-01-01T00:00:00Z
stop_date: 2017-01-01T00:00:00Z
software_version: PREVIMER forecasting system v2
product_name: PREVIMER_F2-MARS3D-MENOR1200_20170101T0000Z.nc
field_type: 3-hourly
comment: Use of Meteo-France ARPEGEHR meteorological data
contact: cdoco-exploit#ifremer.fr
quality_index: 0
nco_openmp_thread_number: 1
dimensions(sizes): nj(201), ni(136), time(248), level(1)
variables(dimensions): float32 H0(nj,ni), float32 TEMP(time,level,nj,ni), float32 XE(time,nj,ni), float32 b(), float32 hc(nj,ni), float64 latitude(nj,ni), float32 level(level), float64 longitude(nj,ni), float32 ni(ni), float32 nj(nj), float32 theta(), float64 time(time)
groups:
Thank you in advance for your help!
I'm not sure if this is exactly what you are looking for, but the error happens because TEMP is 4-dimensional (time, level, nj, ni), while pd.Series only accepts 1-D data. You could try xarray, whose to_series() flattens the extra dimensions into a MultiIndex:
import xarray as xr
file = xr.open_dataset('/gpfs/home/UDCPP/barrier_c/Test_NCO/temp_Corse_10m_201704.nc')
temp_ts = file['TEMP'].to_series()
temp_ts.to_csv('temp.csv',index=True, header=True)
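To illustrate the dimensionality issue without the actual ocean file, here is a minimal sketch with a made-up 4-D array standing in for TEMP (the shapes and values are hypothetical, not the real data): either select a single grid point to get a 1-D series, or flatten the spatial dimensions into a MultiIndex, which is essentially what xarray's to_series() does.

```python
import numpy as np
import pandas as pd

# Hypothetical miniature stand-in for TEMP: shape (time=3, level=1, nj=2, ni=2).
temp = np.arange(12, dtype=float).reshape(3, 1, 2, 2)
dtime = pd.date_range("2017-04-01", periods=3, freq="3h")

# pd.Series(temp, index=dtime) would raise "Data must be 1-dimensional".
# Option 1: pick one grid point (and the only level) to get a 1-D time series.
point_ts = pd.Series(temp[:, 0, 0, 0], index=dtime)

# Option 2: flatten all dimensions into a MultiIndex, one row per
# (time, level, nj, ni) combination.
index = pd.MultiIndex.from_product(
    [dtime, range(1), range(2), range(2)],
    names=["time", "level", "nj", "ni"],
)
flat_ts = pd.Series(temp.ravel(), index=index)
```

Either series can then be written out with to_csv() as in the original code.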
I'm trying to plot a heatmap using this code:
import folium
from folium.plugins import HeatMap
max_Count = (dataM['count'].max())
hmap = folium.Map(location=[53.192838, 8.197006], zoom_start=7,)
hm_wide = HeatMap( list(zip(dataM.latitude.values, dataM.longitude.values, dataM.count.values)),
min_opacity=0.2,
max_val=max_Count,
radius=17, blur=15,
max_zoom=1,
)
hmap.add_child(hm_wide)
The dataframe looks like this:
station count latitude longitude city
Time
2021-05-01 00:00:00 02-MI-JAN-N 11.0 52.5139 13.41780 Berlin
2021-05-01 00:00:00 24-MH-ALB 0.0 52.4925 13.55850 Berlin
2021-05-01 00:00:00 23-TK-KAI 1.0 52.4573 13.51870 Berlin
... ... ... ... ... ...
2021-09-09 23:45:00 50801_Amalienstr 0.0 53.1390 8.22225 Oldenburg
But I'm getting this error:
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
<ipython-input-23-c1b7a410c325> in <module>
5 hmap = folium.Map(location=[53.192838, 8.197006], zoom_start=7,)
6
----> 7 hm_wide = HeatMap( list(zip(dataM.latitude.values, dataM.longitude.values, dataM.count.values)),
8 min_opacity=0.2,
9 max_val=max_Count,
AttributeError: 'function' object has no attribute 'values'
Any idea about the reason behind it and how it can be solved?
Thank you
UPDATE:
I've used dataM['latitude'], dataM['longitude'], dataM['count'] and it works :)) The problem was that count is also a built-in DataFrame method, so attribute access dataM.count returns that method instead of the column; bracket access always returns the column as a Series.
import folium
from folium.plugins import HeatMap
max_Count = (dataM['count'].max())
hmap = folium.Map(location=[53.192838, 8.197006], zoom_start=7,)
hm_wide = HeatMap( list(zip(dataM['latitude'], dataM['longitude'], dataM['count'])),
min_opacity=0.2,
max_val=max_Count,
radius=17, blur=15,
max_zoom=1,
)
hmap.add_child(hm_wide)
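A minimal sketch of the name clash, using a tiny made-up frame (not the real station data), shows why the attribute form failed while the bracket form works:

```python
import pandas as pd

# Hypothetical stand-in frame with a column named "count".
dataM = pd.DataFrame({"count": [11.0, 0.0, 1.0],
                      "latitude": [52.51, 52.49, 52.46],
                      "longitude": [13.42, 13.56, 13.52]})

# Attribute access resolves to the DataFrame *method* count(), which has no
# .values attribute -- hence the AttributeError in the traceback.
method = dataM.count

# Bracket access always returns the column itself.
column = dataM["count"]
```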
The data in test.csv are like this:
TIMESTAMP POLYLINE
0 1408039037 [[-8.585676,41.148522],[-8.585712,41.148639],[...
1 1408038611 [[-8.610876,41.14557],[-8.610858,41.145579],[-...
2 1408038568 [[-8.585739,41.148558],[-8.58573,41.148828],[-...
3 1408039090 [[-8.613963,41.141169],[-8.614125,41.141124],[...
4 1408039177 [[-8.619903,41.148036],[-8.619894,41.148036]]
.. ... ...
315 1419171485 [[-8.570196,41.159484],[-8.570187,41.158962],[...
316 1419170802 [[-8.613873,41.141232],[-8.613882,41.141241],[...
317 1419172121 [[-8.6481,41.152536],[-8.647461,41.15241],[-8....
318 1419171980 [[-8.571699,41.156073],[-8.570583,41.155929],[...
319 1419171420 [[-8.574561,41.180184],[-8.572248,41.17995],[-...
[320 rows x 2 columns]
I read them from csv file in this way:
train = pd.read_csv("path/train.csv",engine='python',error_bad_lines=False)
So, I have this timestamp in Unix format. I want to convert it to UTC time and then extract the year, month, day, and so on.
This is the code for the conversion from Unix timestamp to UTC date time:
train["TIMESTAMP"] = [float(time) for time in train["TIMESTAMP"]]
train["data_time"] = [datetime.datetime.fromtimestamp(time, datetime.timezone.utc) for time in train["TIMESTAMP"]]
To extract year and other information I do this:
train["year"] = train["data_time"].dt.year
train["month"] = train["data_time"].dt.month
train["day"] = train["data_time"].dt.day
train["hour"] = train["data_time"].dt.hour
train["min"] = train["data_time"].dt.minute
But I obtain this error when the execution arrives at the extraction point:
---------------------------------------------------------------------------
AttributeError Traceback (most recent call last)
<ipython-input-30-d2249cabe965> in <module>()
67 train["TIMESTAMP"] = [float(time) for time in train["TIMESTAMP"]]
68 train["data_time"] = [datetime.datetime.fromtimestamp(time, datetime.timezone.utc) for time in train["TIMESTAMP"]]
---> 69 train["year"] = train["data_time"].dt.year
70 train["month"] = train["data_time"].dt.month
71 train["day"] = train["data_time"].dt.day
2 frames
/usr/local/lib/python3.7/dist-packages/pandas/core/indexes/accessors.py in __new__(cls, data)
478 return PeriodProperties(data, orig)
479
--> 480 raise AttributeError("Can only use .dt accessor with datetimelike values")
AttributeError: Can only use .dt accessor with datetimelike values
I also read a lot of similar discussions, but I can't figure out why I obtain this error.
Edited:
So the train["TIMESTAMP"] data are like this:
1408039037
1408038611
1408039090
Then I do this with this data:
train["TIMESTAMP"] = [float(time) for time in train["TIMESTAMP"]]
train["data_time"] = [datetime.datetime.fromtimestamp(time, datetime.timezone.utc) for time in train["TIMESTAMP"]]
train["year"] = train["data_time"].dt.year
train["month"] = train["data_time"].dt.month
train["day"] = train["data_time"].dt.day
train["hour"] = train["data_time"].dt.hour
train["min"] = train["data_time"].dt.minute
train = train[["year", "month", "day", "hour","min"]]
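One likely cause of the .dt error above is that building the column from a list comprehension of tz-aware datetime objects can leave it with object dtype, on which .dt is unavailable. A hedged sketch of a fix, using a small hypothetical frame in place of the real train.csv: pd.to_datetime with unit="s" and utc=True produces a proper datetime64 column, so the .dt accessor works.

```python
import pandas as pd

# Hypothetical stand-in for the frame read from train.csv.
train = pd.DataFrame({"TIMESTAMP": [1408039037, 1408038611, 1408039090]})

# Vectorized conversion: Unix seconds -> tz-aware datetime64 column.
train["data_time"] = pd.to_datetime(train["TIMESTAMP"], unit="s", utc=True)

# The .dt accessor now works as intended.
train["year"] = train["data_time"].dt.year
train["month"] = train["data_time"].dt.month
train["day"] = train["data_time"].dt.day
```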
I am trying to handle data in Python using pandas. I have this code:
import folium
import pandas
mapp = folium.Map(location=[19.997454,73.789803], zoom_start=6, tiles="Stamen Terrain" )
fg = folium.FeatureGroup(name="my map")
df=pandas.read_csv("volcanoes.txt")
cordinates="[" + df["LAT"].astype(str) + "," + df["LON"].astype(str) +"]"
for i in cordinates:
    fg.add_child(folium.Marker(location=i,popup="hey jayesh , welcome to Nashik",icon=folium.Icon(color="green")))
mapp.add_child(fg)
mapp.save("jay1.html")
> Windows PowerShell
> Copyright (C) Microsoft Corporation. All rights reserved.
>
> Try the new cross-platform PowerShell https://aka.ms/pscore6
>
> PS C:\Users\DELL\OneDrive\Desktop\python\volcano> & C:/Users/DELL/AppData/Local/Programs/Python/Python39/python.exe c:/Users/DELL/OneDrive/Desktop/python/volcano/jayesh.py
> Traceback (most recent call last):
>   File "c:\Users\DELL\OneDrive\Desktop\python\volcano\jayesh.py", line 10, in <module>
>     fg.add_child(folium.Marker(location=i,popup="hey jayesh , welcome to Nashik",icon=folium.Icon(color="green")))
>   File "C:\Users\DELL\AppData\Local\Programs\Python\Python39\lib\site-packages\folium\map.py", line 277, in __init__
>     self.location = validate_location(location) if location else None
>   File "C:\Users\DELL\AppData\Local\Programs\Python\Python39\lib\site-packages\folium\utilities.py", line 50, in validate_location
>     raise ValueError('Expected two (lat, lon) values for location, '
> ValueError: Expected two (lat, lon) values for location, instead got: '[48.7767982,-121.810997]'.
> PS C:\Users\DELL\OneDrive\Desktop\python\volcano>
The problem is with this line:
cordinates="[" + df["LAT"].astype(str) + "," + df["LON"].astype(str) +"]"
You are generating a string literal and passing that in.
Try replacing that line with:
cordinates = [(lat, lon) for lat, lon in zip(df["LAT"],df["LON"])]
This will generate a list of (lat, lon) tuples, which should work. I also don't think you need to cast them to str.
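A minimal, self-contained sketch of the difference (the sample coordinates are made up, standing in for rows of volcanoes.txt):

```python
import pandas as pd

# hypothetical sample data standing in for volcanoes.txt
df = pd.DataFrame({"LAT": [48.7767982, 46.8799], "LON": [-121.810997, -121.7269]})

# WRONG: builds one string per row, e.g. '[48.7767982,-121.810997]',
# which folium's validate_location rejects
as_strings = "[" + df["LAT"].astype(str) + "," + df["LON"].astype(str) + "]"

# RIGHT: builds (lat, lon) tuples of floats, which folium accepts
as_tuples = [(lat, lon) for lat, lon in zip(df["LAT"], df["LON"])]
print(as_tuples[0])
```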
**It worked for me like this:**
import folium
import pandas
mapp = folium.Map(location=[19.997454,73.789803], zoom_start=6, tiles="Stamen Terrain" )
fg = folium.FeatureGroup(name="my map")
df=pandas.read_csv("volcanoes.txt")
lat=list(df["LAT"])
lon=list(df["LON"])
for i,j in zip(lat,lon):
    fg.add_child(folium.Marker(location=[i,j],popup="volcanoes",icon=folium.Icon(color="green")))
mapp.add_child(fg)
mapp.save("volcanoes.html")
I am trying to create a web map which shows the locations of volcanoes. I am using the Folium library in Python, and designing it with HTML. I am also using data from a Pandas DataFrame. However, when I run my code, I get the following error:
Traceback (most recent call last):
File "webmap.py", line 20, in <module>
iframe = folium.IFrame(html=html % (name, name, str(elev)), width=200, height=100)
TypeError: not all arguments converted during string formatting
This is my code:
import folium
import pandas
data = pandas.read_csv("Volcanoes.txt")
latitudes = list(data["LAT"])
longitudes = list(data["LON"])
elevations = list(data["ELEV"])
names = list(data["NAME"])
html = """
Volcano name:<br>
%s<br>
Height: %s m
"""
map = folium.Map(location=[38.58, -99.09], zoom_start=6, tiles="Stamen Terrain")
fg = folium.FeatureGroup(name="My Map")
for lat, lon, elev, name in zip(latitudes, longitudes, elevations, names):
    iframe = folium.IFrame(html=html %(name, name, str(elev)), width=200, height=100)
    fg.add_child(folium.Marker(location=[lat, lon], popup=folium.Popup(iframe), icon=folium.Icon(color="green")))
map.add_child(fg)
map.save("Map1Advacned.html")
The Pandas DataFrame contains information about each volcano, including its location (latitude and longitude), elevation, and name, which I parsed into a Python array in the first bit of my code.
Does anyone know why this error occurs? Any help would be much appreciated. Thanks in advance!
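For context, %-formatting requires the argument count to match the placeholder count exactly. A small sketch of the failure mode (the name and height are made-up values):

```python
html = """
Volcano name:<br>
%s<br>
Height: %s m
"""

# two %s placeholders, so exactly two arguments work:
ok = html % ("Mount Rainier", 4392)

# passing a third argument reproduces the error from the question:
try:
    html % ("Mount Rainier", "Mount Rainier", 4392)
except TypeError as exc:
    print(exc)  # not all arguments converted during string formatting
```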
My supplementary data is Singapore government weather station temperature data. I've manipulated it a bit to fit into your example.
Changed the location and zoom params, as Singapore is in a different place and much smaller ;-)
Your core issue is the string substitution into the html variable. I much prefer f-strings, so I have changed it to this and it works.
import folium
import pandas as pd
df = pd.DataFrame({'latitude': [1.3764, 1.256, 1.3337, 1.3135, 1.3399, 1.2799],
'longitude': [103.8492, 103.679, 103.7768, 103.9625, 103.8878, 103.8703],
'value': [32.3, 31.7, 32.2, 29.9, 32.1, 32.5],
'tooltip': ['32.3 Ang Mo Kio Avenue 5 August 09, 2020 at 01:00PM',
'31.7 Banyan Road August 09, 2020 at 01:00PM',
'32.2 Clementi Road August 09, 2020 at 01:00PM',
'29.9 East Coast Parkway August 09, 2020 at 01:00PM',
'32.1 Kim Chuan Road August 09, 2020 at 01:00PM',
'32.5 Marina Gardens Drive August 09, 2020 at 01:00PM']})
data = df.copy().rename({'latitude':"LAT",'longitude':"LON",'value':"ELEV",'tooltip':"NAME"}, axis=1)
latitudes = list(data["LAT"])
longitudes = list(data["LON"])
elevations = list(data["ELEV"])
names = list(data["NAME"])
def myhtml(name, elev):
    return f"""
Volcano name:<br>
{name}<br>
Height: {elev} m
"""
map = folium.Map(location=[1.34, 103.82], zoom_start=12, tiles="Stamen Terrain")
fg = folium.FeatureGroup(name="My Map")
for lat, lon, elev, name in zip(latitudes, longitudes, elevations, names):
    iframe = folium.IFrame(html=myhtml(name, elev), width=200, height=100)
    fg.add_child(folium.Marker(location=[lat, lon], popup=folium.Popup(iframe), icon=folium.Icon(color="green")))
map.add_child(fg)
map.save("Map1Advacned.html")
I have spent almost 2 days scrolling the internet and I was unable to sort out this problem. I am trying to install the graphframes package (Version: 0.2.0-spark2.0-s_2.11) to run with Spark through PyCharm, but, despite my best efforts, it's been impossible.
I have tried almost everything. Please know that I have checked this site here as well before posting this question.
Here is the code I am trying to run:
# IMPORT OTHER LIBS --------------------------------------------------------
import os
import sys
import pandas as pd
# IMPORT SPARK ------------------------------------------------------------------------------------#
# Path to Spark source folder
USER_FILE_PATH = "/Users/<username>"
SPARK_PATH = "/PycharmProjects/GenesAssociation"
SPARK_FILE = "/spark-2.0.0-bin-hadoop2.7"
SPARK_HOME = USER_FILE_PATH + SPARK_PATH + SPARK_FILE
os.environ['SPARK_HOME'] = SPARK_HOME
# Append pySpark to Python Path
sys.path.append(SPARK_HOME + "/python")
sys.path.append(SPARK_HOME + "/python" + "/lib/py4j-0.10.1-src.zip")
try:
    from pyspark import SparkContext
    from pyspark import SparkConf
    from pyspark.sql import SQLContext
    from pyspark.graphframes import GraphFrame
except ImportError as ex:
    print "Can not import Spark Modules", ex
    sys.exit(1)
# GLOBAL VARIABLES --------------------------------------------------------------------------------#
SC = SparkContext('local')
SQL_CONTEXT = SQLContext(SC)
# MAIN CODE ---------------------------------------------------------------------------------------#
if __name__ == "__main__":
    # Main Path to CSV files
    DATA_PATH = '/PycharmProjects/GenesAssociation/data/'
    FILE_NAME = 'gene_gene_associations_50k.csv'

    # LOAD DATA CSV USING PANDAS -----------------------------------------------------------------#
    print "STEP 1: Loading Gene Nodes -------------------------------------------------------------"
    # Read csv file and load as df
    GENES = pd.read_csv(USER_FILE_PATH + DATA_PATH + FILE_NAME,
                        usecols=['OFFICIAL_SYMBOL_A'],
                        low_memory=True,
                        iterator=True,
                        chunksize=1000)
    # Concatenate chunks into list & convert to dataFrame
    GENES_DF = pd.DataFrame(pd.concat(list(GENES), ignore_index=True))
    # Remove duplicates
    GENES_DF_CLEAN = GENES_DF.drop_duplicates(keep='first')
    # Name Columns
    GENES_DF_CLEAN.columns = ['gene_id']
    # Output dataFrame
    print GENES_DF_CLEAN
    # Create vertices
    VERTICES = SQL_CONTEXT.createDataFrame(GENES_DF_CLEAN)
    # Show some vertices
    print VERTICES.take(5)

    print "STEP 2: Loading Gene Edges -------------------------------------------------------------"
    # Read csv file and load as df
    EDGES = pd.read_csv(USER_FILE_PATH + DATA_PATH + FILE_NAME,
                        usecols=['OFFICIAL_SYMBOL_A', 'OFFICIAL_SYMBOL_B', 'EXPERIMENTAL_SYSTEM'],
                        low_memory=True,
                        iterator=True,
                        chunksize=1000)
    # Concatenate chunks into list & convert to dataFrame
    EDGES_DF = pd.DataFrame(pd.concat(list(EDGES), ignore_index=True))
    # Name Columns
    EDGES_DF.columns = ["src", "dst", "rel_type"]
    # Output dataFrame
    print EDGES_DF
    # Create edges
    EDGES = SQL_CONTEXT.createDataFrame(EDGES_DF)
    # Show some edges
    print EDGES.take(5)

    g = gf.GraphFrame(VERTICES, EDGES)
Needless to say, I have tried including the graphframes directory (look here to understand what I did) into Spark's pyspark directory. But it seems like this is not enough... Anything else I have tried just failed. I would appreciate some help with this. You can see below the error message I am getting:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
16/09/19 12:46:02 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/09/19 12:46:03 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
STEP 1: Loading Gene Nodes -------------------------------------------------------------
gene_id
0 MAP2K4
1 MYPN
2 ACVR1
3 GATA2
4 RPA2
5 ARF1
6 ARF3
8 XRN1
9 APP
10 APLP1
11 CITED2
12 EP300
13 APOB
14 ARRB2
15 CSF1R
16 PRRC2A
17 LSM1
18 SLC4A1
19 BCL3
20 ADRB1
21 BRCA1
25 ARVCF
26 PCBD1
27 PSEN2
28 CAPN3
29 ITPR1
30 MAGI1
31 RB1
32 TSG101
33 ORC1
... ...
49379 WDR26
49380 WDR5B
49382 NLE1
49383 WDR12
49385 WDR53
49386 WDR59
49387 WDR61
49409 CHD6
49422 DACT1
49424 KMT2B
49438 SMARCA1
49459 DCLRE1A
49469 F2RL1
49472 SENP8
49475 TSPY1
49479 SERPINB5
49521 HOXA11
49548 SYF2
49553 FOXN3
49557 MLANA
49608 REPIN1
49609 GMNN
49670 HIST2H2BE
49767 BCL7C
49797 SIRT3
49810 KLF4
49858 RHO
49896 MAGEA2
49907 SUV420H2
49958 SAP30L
[6025 rows x 1 columns]
16/09/19 12:46:08 WARN TaskSetManager: Stage 0 contains a task of very large size (107 KB). The maximum recommended task size is 100 KB.
[Row(gene_id=u'MAP2K4'), Row(gene_id=u'MYPN'), Row(gene_id=u'ACVR1'), Row(gene_id=u'GATA2'), Row(gene_id=u'RPA2')]
STEP 2: Loading Gene Edges -------------------------------------------------------------
src dst rel_type
0 MAP2K4 FLNC Two-hybrid
1 MYPN ACTN2 Two-hybrid
2 ACVR1 FNTA Two-hybrid
3 GATA2 PML Two-hybrid
4 RPA2 STAT3 Two-hybrid
5 ARF1 GGA3 Two-hybrid
6 ARF3 ARFIP2 Two-hybrid
7 ARF3 ARFIP1 Two-hybrid
8 XRN1 ALDOA Two-hybrid
9 APP APPBP2 Two-hybrid
10 APLP1 DAB1 Two-hybrid
11 CITED2 TFAP2A Two-hybrid
12 EP300 TFAP2A Two-hybrid
13 APOB MTTP Two-hybrid
14 ARRB2 RALGDS Two-hybrid
15 CSF1R GRB2 Two-hybrid
16 PRRC2A GRB2 Two-hybrid
17 LSM1 NARS Two-hybrid
18 SLC4A1 SLC4A1AP Two-hybrid
19 BCL3 BARD1 Two-hybrid
20 ADRB1 GIPC1 Two-hybrid
21 BRCA1 ATF1 Two-hybrid
22 BRCA1 MSH2 Two-hybrid
23 BRCA1 BARD1 Two-hybrid
24 BRCA1 MSH6 Two-hybrid
25 ARVCF CDH15 Two-hybrid
26 PCBD1 CACNA1C Two-hybrid
27 PSEN2 CAPN1 Two-hybrid
28 CAPN3 TTN Two-hybrid
29 ITPR1 CA8 Two-hybrid
... ... ... ...
49969 SAP30 HDAC3 Affinity Capture-Western
49970 BRCA1 RBBP8 Co-localization
49971 BRCA1 BRCA1 Biochemical Activity
49972 SET TREX1 Co-purification
49973 SET TREX1 Reconstituted Complex
49974 PLAGL1 EP300 Reconstituted Complex
49975 PLAGL1 CREBBP Reconstituted Complex
49976 EP300 PLAGL1 Affinity Capture-Western
49977 MTA1 ESR1 Reconstituted Complex
49978 SIRT2 EP300 Affinity Capture-Western
49979 EP300 SIRT2 Affinity Capture-Western
49980 EP300 HDAC1 Affinity Capture-Western
49981 EP300 SIRT2 Biochemical Activity
49982 MIER1 CREBBP Reconstituted Complex
49983 SMARCA4 SIN3A Affinity Capture-Western
49984 SMARCA4 HDAC2 Affinity Capture-Western
49985 ESR1 NCOA6 Affinity Capture-Western
49986 ESR1 TOP2B Affinity Capture-Western
49987 ESR1 PRKDC Affinity Capture-Western
49988 ESR1 PARP1 Affinity Capture-Western
49989 ESR1 XRCC5 Affinity Capture-Western
49990 ESR1 XRCC6 Affinity Capture-Western
49991 PARP1 TOP2B Affinity Capture-Western
49992 PARP1 PRKDC Affinity Capture-Western
49993 PARP1 XRCC5 Affinity Capture-Western
49994 PARP1 XRCC6 Affinity Capture-Western
49995 SIRT3 XRCC6 Affinity Capture-Western
49996 SIRT3 XRCC6 Reconstituted Complex
49997 SIRT3 XRCC6 Biochemical Activity
49998 HDAC1 PAX3 Affinity Capture-Western
[49999 rows x 3 columns]
16/09/19 12:46:11 WARN TaskSetManager: Stage 1 contains a task of very large size (1211 KB). The maximum recommended task size is 100 KB.
[Row(src=u'MAP2K4', dst=u'FLNC', rel_type=u'Two-hybrid'), Row(src=u'MYPN', dst=u'ACTN2', rel_type=u'Two-hybrid'), Row(src=u'ACVR1', dst=u'FNTA', rel_type=u'Two-hybrid'), Row(src=u'GATA2', dst=u'PML', rel_type=u'Two-hybrid'), Row(src=u'RPA2', dst=u'STAT3', rel_type=u'Two-hybrid')]
Traceback (most recent call last):
File "/Users/username/PycharmProjects/GenesAssociation/__init__.py", line 99, in <module>
g = gf.GraphFrame(VERTICES, EDGES)
File "/Users/username/PycharmProjects/GenesAssociation/spark-2.0.0-bin-hadoop2.7/python/pyspark/graphframes/graphframe.py", line 62, in __init__
self._jvm_gf_api = _java_api(self._sc)
File "/Users/username/PycharmProjects/GenesAssociation/spark-2.0.0-bin-hadoop2.7/python/pyspark/graphframes/graphframe.py", line 34, in _java_api
return jsc._jvm.Thread.currentThread().getContextClassLoader().loadClass(javaClassName) \
File "/Users/username/PycharmProjects/GenesAssociation/spark-2.0.0-bin-hadoop2.7/python/lib/py4j-0.10.1-src.zip/py4j/java_gateway.py", line 933, in __call__
File "/Users/username/PycharmProjects/GenesAssociation/spark-2.0.0-bin-hadoop2.7/python/pyspark/sql/utils.py", line 63, in deco
return f(*a, **kw)
File "/Users/username/PycharmProjects/GenesAssociation/spark-2.0.0-bin-hadoop2.7/python/lib/py4j-0.10.1-src.zip/py4j/protocol.py", line 312, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o50.loadClass.
: java.lang.ClassNotFoundException: org.graphframes.GraphFramePythonAPI
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:237)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:280)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:128)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:211)
at java.lang.Thread.run(Thread.java:745)
Process finished with exit code 1
Thanks in advance.
You can set PYSPARK_SUBMIT_ARGS either in your code
os.environ["PYSPARK_SUBMIT_ARGS"] = (
    "--packages graphframes:graphframes:0.2.0-spark2.0-s_2.11 pyspark-shell"
)
spark = SparkSession.builder.getOrCreate()
or in PyCharm edit run configuration (Run -> Edit configuration -> Choose configuration -> Select configuration tab -> Choose Environment variables -> Add PYSPARK_SUBMIT_ARGS):
with a minimal working example:
import os
import sys
SPARK_HOME = ...
os.environ["SPARK_HOME"] = SPARK_HOME
# os.environ["PYSPARK_SUBMIT_ARGS"] = ... If not set in PyCharm config
sys.path.append(os.path.join(SPARK_HOME, "python"))
sys.path.append(os.path.join(SPARK_HOME, "python/lib/py4j-0.10.3-src.zip"))
from pyspark.sql import SparkSession
spark = SparkSession.builder.getOrCreate()
v = spark.createDataFrame([("a", "foo"), ("b", "bar"),], ["id", "attr"])
e = spark.createDataFrame([("a", "b", "foobar")], ["src", "dst", "rel"])
from graphframes import *
g = GraphFrame(v, e)
g.inDegrees.show()
spark.stop()
You could also add the packages or jars to your spark-defaults.conf.
If you use Python 3 with graphframes 0.2, there is a known issue with extracting the Python libraries from the JAR, so you'll have to do it manually. You can, for example, download the JAR file, unzip it, and make sure that the root directory containing graphframes is on your Python path. This has been fixed in graphframes 0.3.