Using graphframes with PyCharm - python

I have spent almost two days searching the internet and have been unable to solve this problem. I am trying to install the graphframes package (version 0.2.0-spark2.0-s_2.11) to run with Spark through PyCharm, but, despite my best efforts, it has been impossible.
I have tried almost everything. Please know that I also checked this site here before posting this question.
Here is the code I am trying to run:
# IMPORT OTHER LIBS --------------------------------------------------------
import os
import sys
import pandas as pd
# IMPORT SPARK ------------------------------------------------------------------------------------#
# Path to Spark source folder
USER_FILE_PATH = "/Users/<username>"
SPARK_PATH = "/PycharmProjects/GenesAssociation"
SPARK_FILE = "/spark-2.0.0-bin-hadoop2.7"
SPARK_HOME = USER_FILE_PATH + SPARK_PATH + SPARK_FILE
os.environ['SPARK_HOME'] = SPARK_HOME
# Append pySpark to Python Path
sys.path.append(SPARK_HOME + "/python")
sys.path.append(SPARK_HOME + "/python" + "/lib/py4j-0.10.1-src.zip")
try:
    from pyspark import SparkContext
    from pyspark import SparkConf
    from pyspark.sql import SQLContext
    from pyspark.graphframes import GraphFrame
except ImportError as ex:
    print "Can not import Spark Modules", ex
    sys.exit(1)
# GLOBAL VARIABLES --------------------------------------------------------------------------------#
SC = SparkContext('local')
SQL_CONTEXT = SQLContext(SC)
# MAIN CODE ---------------------------------------------------------------------------------------#
if __name__ == "__main__":
    # Main Path to CSV files
    DATA_PATH = '/PycharmProjects/GenesAssociation/data/'
    FILE_NAME = 'gene_gene_associations_50k.csv'

    # LOAD DATA CSV USING PANDAS -----------------------------------------------------------------#
    print "STEP 1: Loading Gene Nodes -------------------------------------------------------------"
    # Read csv file and load as df
    GENES = pd.read_csv(USER_FILE_PATH + DATA_PATH + FILE_NAME,
                        usecols=['OFFICIAL_SYMBOL_A'],
                        low_memory=True,
                        iterator=True,
                        chunksize=1000)
    # Concatenate chunks into list & convert to dataFrame
    GENES_DF = pd.DataFrame(pd.concat(list(GENES), ignore_index=True))
    # Remove duplicates
    GENES_DF_CLEAN = GENES_DF.drop_duplicates(keep='first')
    # Name Columns
    GENES_DF_CLEAN.columns = ['gene_id']
    # Output dataFrame
    print GENES_DF_CLEAN
    # Create vertices
    VERTICES = SQL_CONTEXT.createDataFrame(GENES_DF_CLEAN)
    # Show some vertices
    print VERTICES.take(5)

    print "STEP 2: Loading Gene Edges -------------------------------------------------------------"
    # Read csv file and load as df
    EDGES = pd.read_csv(USER_FILE_PATH + DATA_PATH + FILE_NAME,
                        usecols=['OFFICIAL_SYMBOL_A', 'OFFICIAL_SYMBOL_B', 'EXPERIMENTAL_SYSTEM'],
                        low_memory=True,
                        iterator=True,
                        chunksize=1000)
    # Concatenate chunks into list & convert to dataFrame
    EDGES_DF = pd.DataFrame(pd.concat(list(EDGES), ignore_index=True))
    # Name Columns
    EDGES_DF.columns = ["src", "dst", "rel_type"]
    # Output dataFrame
    print EDGES_DF
    # Create edges
    EDGES = SQL_CONTEXT.createDataFrame(EDGES_DF)
    # Show some edges
    print EDGES.take(5)

    g = gf.GraphFrame(VERTICES, EDGES)
Needless to say, I have tried copying the graphframes directory (look here to understand what I did) into Spark's pyspark directory, but it seems that is not enough... Everything else I have tried has also failed. I would appreciate some help with this. Below is the error message I am getting:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel).
16/09/19 12:46:02 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
16/09/19 12:46:03 WARN Utils: Service 'SparkUI' could not bind on port 4040. Attempting port 4041.
STEP 1: Loading Gene Nodes -------------------------------------------------------------
gene_id
0 MAP2K4
1 MYPN
2 ACVR1
3 GATA2
4 RPA2
5 ARF1
6 ARF3
8 XRN1
9 APP
10 APLP1
11 CITED2
12 EP300
13 APOB
14 ARRB2
15 CSF1R
16 PRRC2A
17 LSM1
18 SLC4A1
19 BCL3
20 ADRB1
21 BRCA1
25 ARVCF
26 PCBD1
27 PSEN2
28 CAPN3
29 ITPR1
30 MAGI1
31 RB1
32 TSG101
33 ORC1
... ...
49379 WDR26
49380 WDR5B
49382 NLE1
49383 WDR12
49385 WDR53
49386 WDR59
49387 WDR61
49409 CHD6
49422 DACT1
49424 KMT2B
49438 SMARCA1
49459 DCLRE1A
49469 F2RL1
49472 SENP8
49475 TSPY1
49479 SERPINB5
49521 HOXA11
49548 SYF2
49553 FOXN3
49557 MLANA
49608 REPIN1
49609 GMNN
49670 HIST2H2BE
49767 BCL7C
49797 SIRT3
49810 KLF4
49858 RHO
49896 MAGEA2
49907 SUV420H2
49958 SAP30L
[6025 rows x 1 columns]
16/09/19 12:46:08 WARN TaskSetManager: Stage 0 contains a task of very large size (107 KB). The maximum recommended task size is 100 KB.
[Row(gene_id=u'MAP2K4'), Row(gene_id=u'MYPN'), Row(gene_id=u'ACVR1'), Row(gene_id=u'GATA2'), Row(gene_id=u'RPA2')]
STEP 2: Loading Gene Edges -------------------------------------------------------------
src dst rel_type
0 MAP2K4 FLNC Two-hybrid
1 MYPN ACTN2 Two-hybrid
2 ACVR1 FNTA Two-hybrid
3 GATA2 PML Two-hybrid
4 RPA2 STAT3 Two-hybrid
5 ARF1 GGA3 Two-hybrid
6 ARF3 ARFIP2 Two-hybrid
7 ARF3 ARFIP1 Two-hybrid
8 XRN1 ALDOA Two-hybrid
9 APP APPBP2 Two-hybrid
10 APLP1 DAB1 Two-hybrid
11 CITED2 TFAP2A Two-hybrid
12 EP300 TFAP2A Two-hybrid
13 APOB MTTP Two-hybrid
14 ARRB2 RALGDS Two-hybrid
15 CSF1R GRB2 Two-hybrid
16 PRRC2A GRB2 Two-hybrid
17 LSM1 NARS Two-hybrid
18 SLC4A1 SLC4A1AP Two-hybrid
19 BCL3 BARD1 Two-hybrid
20 ADRB1 GIPC1 Two-hybrid
21 BRCA1 ATF1 Two-hybrid
22 BRCA1 MSH2 Two-hybrid
23 BRCA1 BARD1 Two-hybrid
24 BRCA1 MSH6 Two-hybrid
25 ARVCF CDH15 Two-hybrid
26 PCBD1 CACNA1C Two-hybrid
27 PSEN2 CAPN1 Two-hybrid
28 CAPN3 TTN Two-hybrid
29 ITPR1 CA8 Two-hybrid
... ... ... ...
49969 SAP30 HDAC3 Affinity Capture-Western
49970 BRCA1 RBBP8 Co-localization
49971 BRCA1 BRCA1 Biochemical Activity
49972 SET TREX1 Co-purification
49973 SET TREX1 Reconstituted Complex
49974 PLAGL1 EP300 Reconstituted Complex
49975 PLAGL1 CREBBP Reconstituted Complex
49976 EP300 PLAGL1 Affinity Capture-Western
49977 MTA1 ESR1 Reconstituted Complex
49978 SIRT2 EP300 Affinity Capture-Western
49979 EP300 SIRT2 Affinity Capture-Western
49980 EP300 HDAC1 Affinity Capture-Western
49981 EP300 SIRT2 Biochemical Activity
49982 MIER1 CREBBP Reconstituted Complex
49983 SMARCA4 SIN3A Affinity Capture-Western
49984 SMARCA4 HDAC2 Affinity Capture-Western
49985 ESR1 NCOA6 Affinity Capture-Western
49986 ESR1 TOP2B Affinity Capture-Western
49987 ESR1 PRKDC Affinity Capture-Western
49988 ESR1 PARP1 Affinity Capture-Western
49989 ESR1 XRCC5 Affinity Capture-Western
49990 ESR1 XRCC6 Affinity Capture-Western
49991 PARP1 TOP2B Affinity Capture-Western
49992 PARP1 PRKDC Affinity Capture-Western
49993 PARP1 XRCC5 Affinity Capture-Western
49994 PARP1 XRCC6 Affinity Capture-Western
49995 SIRT3 XRCC6 Affinity Capture-Western
49996 SIRT3 XRCC6 Reconstituted Complex
49997 SIRT3 XRCC6 Biochemical Activity
49998 HDAC1 PAX3 Affinity Capture-Western
[49999 rows x 3 columns]
16/09/19 12:46:11 WARN TaskSetManager: Stage 1 contains a task of very large size (1211 KB). The maximum recommended task size is 100 KB.
[Row(src=u'MAP2K4', dst=u'FLNC', rel_type=u'Two-hybrid'), Row(src=u'MYPN', dst=u'ACTN2', rel_type=u'Two-hybrid'), Row(src=u'ACVR1', dst=u'FNTA', rel_type=u'Two-hybrid'), Row(src=u'GATA2', dst=u'PML', rel_type=u'Two-hybrid'), Row(src=u'RPA2', dst=u'STAT3', rel_type=u'Two-hybrid')]
Traceback (most recent call last):
File "/Users/username/PycharmProjects/GenesAssociation/__init__.py", line 99, in <module>
g = gf.GraphFrame(VERTICES, EDGES)
File "/Users/username/PycharmProjects/GenesAssociation/spark-2.0.0-bin-hadoop2.7/python/pyspark/graphframes/graphframe.py", line 62, in __init__
self._jvm_gf_api = _java_api(self._sc)
File "/Users/username/PycharmProjects/GenesAssociation/spark-2.0.0-bin-hadoop2.7/python/pyspark/graphframes/graphframe.py", line 34, in _java_api
return jsc._jvm.Thread.currentThread().getContextClassLoader().loadClass(javaClassName) \
File "/Users/username/PycharmProjects/GenesAssociation/spark-2.0.0-bin-hadoop2.7/python/lib/py4j-0.10.1-src.zip/py4j/java_gateway.py", line 933, in __call__
File "/Users/username/PycharmProjects/GenesAssociation/spark-2.0.0-bin-hadoop2.7/python/pyspark/sql/utils.py", line 63, in deco
return f(*a, **kw)
File "/Users/username/PycharmProjects/GenesAssociation/spark-2.0.0-bin-hadoop2.7/python/lib/py4j-0.10.1-src.zip/py4j/protocol.py", line 312, in get_return_value
py4j.protocol.Py4JJavaError: An error occurred while calling o50.loadClass.
: java.lang.ClassNotFoundException: org.graphframes.GraphFramePythonAPI
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at py4j.reflection.MethodInvoker.invoke(MethodInvoker.java:237)
at py4j.reflection.ReflectionEngine.invoke(ReflectionEngine.java:357)
at py4j.Gateway.invoke(Gateway.java:280)
at py4j.commands.AbstractCommand.invokeMethod(AbstractCommand.java:128)
at py4j.commands.CallCommand.execute(CallCommand.java:79)
at py4j.GatewayConnection.run(GatewayConnection.java:211)
at java.lang.Thread.run(Thread.java:745)
Process finished with exit code 1
Thanks in advance.

You can set PYSPARK_SUBMIT_ARGS either in your code:
os.environ["PYSPARK_SUBMIT_ARGS"] = (
"--packages graphframes:graphframes:0.2.0-spark2.0-s_2.11 pyspark-shell"
)
spark = SparkSession.builder.getOrCreate()
or in PyCharm's run configuration (Run -> Edit Configurations -> choose your configuration -> select the Configuration tab -> Environment variables -> add PYSPARK_SUBMIT_ARGS).
A minimal working example:
import os
import sys
SPARK_HOME = ...
os.environ["SPARK_HOME"] = SPARK_HOME
# os.environ["PYSPARK_SUBMIT_ARGS"] = ... If not set in PyCharm config
sys.path.append(os.path.join(SPARK_HOME, "python"))
sys.path.append(os.path.join(SPARK_HOME, "python/lib/py4j-0.10.3-src.zip"))
from pyspark.sql import SparkSession
spark = SparkSession.builder.getOrCreate()
v = spark.createDataFrame([("a", "foo"), ("b", "bar"),], ["id", "attr"])
e = spark.createDataFrame([("a", "b", "foobar")], ["src", "dst", "rel"])
from graphframes import *
g = GraphFrame(v, e)
g.inDegrees.show()
spark.stop()
You could also add the packages or jars to your spark-defaults.conf.
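For the spark-defaults.conf route, the entry would look like the following (a sketch; adjust the package coordinates to your Spark/Scala build):

```
spark.jars.packages    graphframes:graphframes:0.2.0-spark2.0-s_2.11
```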
If you use Python 3 with graphframes 0.2 there is a known issue with extracting the Python libraries from the JAR, so you'll have to do it manually: for example, download the JAR file, unzip it, and make sure the root directory containing graphframes is on your Python path. This has been fixed in graphframes 0.3.
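A minimal sketch of that manual extraction, assuming you have already downloaded the graphframes JAR (a JAR is just a zip archive; the file name below is hypothetical):

```python
import sys
import zipfile

def extract_graphframes(jar_path, target="graphframes_lib"):
    """Extract the bundled Python package from a graphframes JAR."""
    with zipfile.ZipFile(jar_path) as jar:
        # Keep only the Python package shipped inside the JAR.
        members = [m for m in jar.namelist() if m.startswith("graphframes/")]
        jar.extractall(target, members=members)
    sys.path.insert(0, target)  # make `import graphframes` resolvable
    return members

# extract_graphframes("graphframes-0.2.0-spark2.0-s_2.11.jar")
```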

Related

Mac m1 pyodbc.connect causes zsh: abort

When I run pyodbc.connect(...) my Python session crashes and I get a zsh: abort error in the terminal.
Setup:
Mac M1 Pro, macOS Monterey
Python 3.8.9
Installed MS ODBC drivers (tried both 17 and 18).
Tried running in both base Python and virtual environments.
In base Python, I get the following system error:
Translated Report (Full Report Below)
Process: Python [1869]
Path: /Library/Developer/CommandLineTools/Library/Frameworks/Python3.framework/Versions/3.8/Resources/Python.app/Contents/MacOS/Python
Identifier: com.apple.python3
Version: 3.8.9 (3.8.9)
Build Info: python3-103000000000000~1538
Code Type: ARM-64 (Native)
Parent Process: zsh [611]
Responsible: Terminal [80951]
User ID: 501
Date/Time: 2022-03-08 17:42:55.3063 +0200
OS Version: macOS 12.2.1 (21D62)
Report Version: 12
Anonymous UUID: 919246FC-3C09-6151-CCAB-05C65A4A9B63
Sleep/Wake UUID: 4CFF985F-843A-4C63-ABC8-97435300AADE
Time Awake Since Boot: 370000 seconds
Time Since Wake: 2014 seconds
System Integrity Protection: enabled
Crashed Thread: 0 Dispatch queue: com.apple.main-thread
Exception Type: EXC_CRASH (SIGABRT)
Exception Codes: 0x0000000000000000, 0x0000000000000000
Exception Note: EXC_CORPSE_NOTIFY
Application Specific Information:
stack buffer overflow
Thread 0 Crashed:: Dispatch queue: com.apple.main-thread
0 libsystem_kernel.dylib 0x18ff659b8 __pthread_kill + 8
1 libsystem_pthread.dylib 0x18ff98eb0 pthread_kill + 288
2 libsystem_c.dylib 0x18fed63a0 __abort + 128
3 libsystem_c.dylib 0x18fec818c __stack_chk_fail + 96
4 pyodbc.cpython-38-darwin.so 0x102c4f5d4 GetErrorFromHandle(Connection*, char const*, void*, void*) + 1024
5 pyodbc.cpython-38-darwin.so 0x102c4f188 RaiseErrorFromHandle(Connection*, char const*, void*, void*) + 16
6 pyodbc.cpython-38-darwin.so 0x102c48cf0 Connection_New(_object*, bool, bool, long, bool, _object*, Object&) + 1324
7 pyodbc.cpython-38-darwin.so 0x102c54b34 mod_connect(_object*, _object*, _object*) + 1468
8 Python3 0x103185968 cfunction_call_varargs + 140
9 Python3 0x103185368 _PyObject_MakeTpCall + 372
10 Python3 0x1032523fc call_function + 448
11 Python3 0x10324f800 _PyEval_EvalFrameDefault + 23692
12 Python3 0x1032532e4 _PyEval_EvalCodeWithName + 3048
13 Python3 0x103249ae0 PyEval_EvalCode + 60
14 Python3 0x10328f384 PyRun_InteractiveOneObjectEx + 712
15 Python3 0x10328e92c PyRun_InteractiveLoopFlags + 156
16 Python3 0x10328e84c PyRun_AnyFileExFlags + 72
17 Python3 0x1032ac808 Py_RunMain + 2120
18 Python3 0x1032acb1c pymain_main + 340
19 Python3 0x1032acb98 Py_BytesMain + 40
20 dyld 0x1029ed0f4 start + 520
(this is just a small extract of the report because I can't figure out how to "embed" the full report in a scrollable object in my post)
I have tried zsh: abort python error when I try to run the app in venv
Any help is appreciated.

TypeError: analyzers() missing 1 required positional argument: 'self'

I am getting the following error when trying to execute publicly available code (lines 200-205, the "Features Drift" section) from https://github.com/evidentlyai/evidently/blob/main/evidently/tutorials/historical_drift_visualization.ipynb:
TypeError Traceback (most recent call last)
<ipython-input-17-0cdbc9afe012> in <module>
3
4 for date in experiment_batches:
----> 5 drifts = detect_features_drift(raw_data.loc[reference_dates[0]:reference_dates[1]],
6 raw_data.loc[date[0]:date[1]],
7 column_mapping=data_columns,
<ipython-input-16-044c700989fb> in detect_features_drift(reference, production, column_mapping, confidence, threshold, get_pvalues)
8 """
9
---> 10 data_drift_profile = Profile(sections=[DataDriftProfileSection])
11 data_drift_profile.calculate(reference, production, column_mapping=column_mapping)
12 report = data_drift_profile.json()
~\AppData\Roaming\Python\Python38\site-packages\evidently\model_profile\model_profile.py in __init__(self, sections, options)
16
17 def __init__(self, sections: Sequence[ProfileSection], options: Optional[list] = None):
---> 18 super().__init__(sections, options if options is not None else [])
19 self.result = {}
20
~\AppData\Roaming\Python\Python38\site-packages\evidently\pipeline\pipeline.py in __init__(self, stages, options)
20 self.analyzers_results = {}
21 self.options_provider = OptionsProvider()
---> 22 self._analyzers = list(itertools.chain.from_iterable([stage.analyzers() for stage in stages]))
23 for option in options:
24 self.options_provider.add(option)
~\AppData\Roaming\Python\Python38\site-packages\evidently\pipeline\pipeline.py in <listcomp>(.0)
20 self.analyzers_results = {}
21 self.options_provider = OptionsProvider()
---> 22 self._analyzers = list(itertools.chain.from_iterable([stage.analyzers() for stage in stages]))
23 for option in options:
24 self.options_provider.add(option)
TypeError: analyzers() missing 1 required positional argument: 'self'
I obtained a similar error after trying to execute lines 195-196 (the "Dataset drift" section).
I use Python 3.8.8.
I installed Evidently with some problems. After a successful installation (pip install --user Evidently) I got the following error:
mitosheet 0.1.361 requires plotly==5.3.0, but you have plotly 4.12.0 which is incompatible
and after pip install plotly==5.3.0, pip informed me that:
0.1.34.dev0 requires plotly~=4.12.0, but you have plotly 5.3.0 which is incompatible.
Simply add parentheses after it, so that you pass an instance rather than the class itself.
For example:
data_and_target_drift_report = Dashboard(tabs=[DataDriftTab(),CatTargetDriftTab()])
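The error itself is generic Python behaviour, not specific to evidently: calling an instance method on the class, rather than on an instance, leaves `self` unfilled. A tiny self-contained sketch:

```python
class Section:
    def analyzers(self):
        return ["drift"]

# Passing the class itself: the unbound call has no instance to supply `self`,
# which mirrors the "missing 1 required positional argument: 'self'" error.
try:
    Section.analyzers()
except TypeError as err:
    print(err)

# Passing an instance (note the parentheses) works as intended.
assert Section().analyzers() == ["drift"]
```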

Python - Exception: Data must be 1-dimensional when running pd.Series

I want to run the following code. My goal is to produce a .csv dataframe from a netCDF file in which I have isolated variables. I have also copied the print(my_file) output below so you can see the structure of my netCDF file. The error occurs when I run pd.Series, as shown below:
temp_ts = pd.Series(temp, index=dtime)
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/gpfs/apps/miniconda3/lib/python3.7/site-packages/pandas/core/series.py", line 305, in __init__
data = sanitize_array(data, index, dtype, copy, raise_cast_failure=True)
File "/gpfs/apps/miniconda3/lib/python3.7/site-packages/pandas/core/construction.py", line 482, in sanitize_array
raise Exception("Data must be 1-dimensional")
Exception: Data must be 1-dimensional
Below is all of my code:
#netcdf to .csv
import netCDF4
import pandas as pd
import numpy as np
temp_nc_file = '/gpfs/home/UDCPP/barrier_c/Test_NCO/temp_Corse_10m_201704.nc'
nc = netCDF4.Dataset(temp_nc_file, mode='r')
nc.variables.keys()
lat = nc.variables['latitude'][:]
lon = nc.variables['longitude'][:]
time_var = nc.variables['time']
dtime = netCDF4.num2date(time_var[:],time_var.units)
temp = nc.variables['TEMP'][:]
# a pandas.Series designed for time series of a 2D lat,lon grid
temp_ts = pd.Series(temp, index=dtime)
temp_ts.to_csv('temp.csv',index=True, header=True)
And finally, the print(my_file) result, if useful:
print(nc)
<class 'netCDF4._netCDF4.Dataset'>
root group (NETCDF4_CLASSIC data model, file format HDF5):
limi: 0
lima: 1100
pasi: 1
ljmi: 0
ljma: 462
pasj: 1
lkmi: 1
lkma: 60
pask: 1
global_imin: 0
global_imax: 1100
global_jmin: 0
global_jmax: 462
data_type: OCO oriented grid
format_version: 1.3.1
Conventions: CF-1.6 OCO-1.3.1 COMODO-1.0
netcdf_version: 4.1.2
product_version: 1.0
references: http://www.previmer.org/
easting: longitude
northing: latitude
grid_projection: n/a
distribution_statement: Data restrictions: for registered users only
institution: IFREMER
institution_references: http://www.ifremer.fr/
data_centre: IFREMER OCO DATA CENTER
data_centre_references: http://www.previmer.org/
title: PREVIMER MENOR 1200 forecast
creation_date: 2017-01-03T20:21:27Z
run_time: 2017-01-03T20:21:27Z
history: Tue Jun 9 18:24:38 2020: nces -O -d level,10,10 temp_Corse_201704.nc temp_Corse_10m_201704.nc
Tue Jun 9 15:41:33 2020: nces -O -d ni,565,700 -d nj,150,350 temp_201704.nc temp_Corse_201704.nc
Tue Jun 9 15:32:46 2020: ncks -O -v TEMP champs_meno_BE201704.nc temp_201704.nc
Mon Nov 6 10:27:14 2017: /appli/nco/nco-4.6.4__gcc-6.3.0/bin/ncrcat /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170101T0000Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170101T0300Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170101T0600Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170101T0900Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170101T1200Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170101T1500Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170101T1800Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170101T2100Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170102T0000Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170102T0300Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170102T0600Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170102T0900Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170102T1200Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170102T1500Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170102T1800Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170102T2100Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170103T0000Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170103T0300Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170103T0600Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170103T0900Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170103T1200Z.nc 
/home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170103T1500Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170103T1800Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170103T2100Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170104T0000Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170104T0300Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170104T0600Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170104T0900Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170104T1200Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170104T1500Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170104T1800Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170104T2100Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170105T0000Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170105T0300Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170105T0600Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170105T0900Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170105T1200Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170105T1500Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170105T1800Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170105T2100Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170106T0000Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170106T0300Z.nc 
/home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170106T0600Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170106T0900Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170106T1200Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170106T1500Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170106T1800Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170106T2100Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170107T0000Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170107T0300Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170107T0600Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170107T0900Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170107T1200Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170107T1500Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170107T1800Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170107T2100Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170108T0000Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170108T0300Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170108T0600Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170108T0900Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170108T1200Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170108T1500Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170108T1800Z.nc 
/home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170108T2100Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170109T0000Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170109T0300Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170109T0600Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170109T0900Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170109T1200Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170109T1500Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170109T1800Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170109T2100Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170110T0000Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170110T0300Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170110T0600Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170110T0900Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170110T1200Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170110T1500Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170110T1800Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170110T2100Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170111T0000Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170111T0300Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170111T0600Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170111T0900Z.nc 
/home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170111T1200Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170111T1500Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170111T1800Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170111T2100Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170112T0000Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170112T0300Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170112T0600Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170112T0900Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170112T1200Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170112T1500Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170112T1800Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170112T2100Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170113T0000Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170113T0300Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170113T0600Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170113T0900Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170113T1200Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170113T1500Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170113T1800Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170113T2100Z.nc /home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170114T0000Z.nc 
/home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170114T0300Z.nc
[... one MARC_F2-MARS3D-MENOR1200_*.nc file every 3 hours, from 20170114T0300Z through 20170131T2100Z ...]
/home1/datawork/scoudray/DATARESU/MENOR/201701BE/MARC_F2-MARS3D-MENOR1200_20170131T2100Z.nc /home1/datawork//scoudray/DATARESU/MENOR/2017BE/champs_meno_BE201701.nc
2017-01-03T20:21:27Z: creation
model_name: MARS
arakawa_grid_type: C1
source: MARS3D V10.10
area: North Western Mediterranean Sea
southernmost_latitude: 39.5000
northernmost_latitude: 44.5000
latitude_resolution: 1.082250000000000E-002
westernmost_longitude: 0.0000
easternmost_longitude: 15.9999
longitude_resolution: 1.454540000000000E-002
minimum_depth: 5.000000
maximum_depth: 3500.000
depth_resolution: n/a
forecast_range: 4-days forecast
forecast_type: forecast
operational_status: experimental
NCO: netCDF Operators version 4.9.3 (Homepage = http://nco.sf.net, Code = http://github.com/nco/nco)
start_date: 2017-01-01T00:00:00Z
stop_date: 2017-01-01T00:00:00Z
software_version: PREVIMER forecasting system v2
product_name: PREVIMER_F2-MARS3D-MENOR1200_20170101T0000Z.nc
field_type: 3-hourly
comment: Use of Meteo-France ARPEGEHR meteorological data
contact: cdoco-exploit@ifremer.fr
quality_index: 0
nco_openmp_thread_number: 1
dimensions(sizes): nj(201), ni(136), time(248), level(1)
variables(dimensions): float32 H0(nj,ni), float32 TEMP(time,level,nj,ni), float32 XE(time,nj,ni), float32 b(), float32 hc(nj,ni), float64 latitude(nj,ni), float32 level(level), float64 longitude(nj,ni), float32 ni(ni), float32 nj(nj), float32 theta(), float64 time(time)
groups:
Thank you in advance for your help!
I'm not sure if this is exactly what you are looking for, but you could try this:
import xarray as xr

# Open the NetCDF file and pull the TEMP variable out as a pandas Series,
# then write it to CSV with the (time, level, nj, ni) index included.
ds = xr.open_dataset('/gpfs/home/UDCPP/barrier_c/Test_NCO/temp_Corse_10m_201704.nc')
temp_ts = ds['TEMP'].to_series()
temp_ts.to_csv('temp.csv', index=True, header=True)
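If the goal is to run this over the whole 3-hourly series listed in the question, generating the file names is straightforward with the standard library. This is just a sketch, not the poster's code; the template string is taken from the listing above:

```python
from datetime import datetime, timedelta

def three_hourly_names(start, stop, template):
    """Yield MARS3D-style file names at 3-hour steps, inclusive of both ends."""
    t = start
    while t <= stop:
        # e.g. 2017-01-14 03:00 -> '20170114T0300'
        yield template.format(t.strftime("%Y%m%dT%H%M"))
        t += timedelta(hours=3)

names = list(three_hourly_names(
    datetime(2017, 1, 14, 3), datetime(2017, 1, 14, 12),
    "MARC_F2-MARS3D-MENOR1200_{}Z.nc"))
# names[0] is 'MARC_F2-MARS3D-MENOR1200_20170114T0300Z.nc'
```

The resulting list (with the directory prefix added) could then be passed to xarray's open_mfdataset to open the files as a single dataset.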

Segmentation Violation when running MATLAB r2017a with caffe (python script)

I get the following crash when running MATLAB R2017a with caffe from a Python script (selective_search.py from https://github.com/sergeyk/selective_search_ijcv_with_python). I run it as user root on Ubuntu 17.10, in CPU-only mode. I also tried running the *.m file directly in MATLAB, and it works fine. More details are below, including the MATLAB invocation parameters (['matlab', '-softwareopengl', etc.).
root@desktop-home:/home/USERNAME/deepLearning/hed/python# python detect.py /home/USERNAME/Desktop/input.txt /home/USERNAME/Desktop/output.csv
CPU mode !!!!!!
WARNING: Logging before InitGoogleLogging() is written to STDERR
I0102 22:59:10.099709 3052 net.cpp:50] Initializing net from parameters:
name: "FCN"
input: "data"
input_dim: 1
input_dim: 3
...
...
...
I0102 22:59:10.636869 3052 upgrade_proto.cpp:620] Attempting to upgrade input file specified using deprecated V1LayerParameter: /home/USERNAME/deepLearning/hed/examples/hed/5stage-vgg.caffemodel
I0102 22:59:10.847916 3052 upgrade_proto.cpp:628] Successfully upgraded file specified using deprecated V1LayerParameter
Loading input...
selective_search_rcnn({'/home/USERNAME/Desktop/camera_500_X_500.jpg'}, '/tmp/tmphzvr5w.mat')
['matlab', '-softwareopengl', '-nojvm', '-r', "try; selective_search_rcnn({'/home/USERNAME/Desktop/camera_500_X_500.jpg'}, '/tmp/tmphzvr5w.mat'); catch; exit; end; exit"]
/home/USERNAME/deepLearning/hed/python/selective_search_ijcv_with_python
No protocol specified
------------------------------------------------------------------------
Segmentation violation detected at Tue Jan 2 22:59:13 2018
------------------------------------------------------------------------
Configuration:
Crash Decoding : Disabled - No sandbox or build area path
Crash Mode : continue (default)
Current Graphics Driver: Unknown software
Current Visual : None
Default Encoding : UTF-8
Deployed : false
GNU C Library : 2.26 stable
Host Name : USERNAME-home
MATLAB Architecture : glnxa64
MATLAB Entitlement ID: 3324953
MATLAB Root : /usr/local/MATLAB/R2017a
MATLAB Version : 9.2.0.556344 (R2017a)
OpenGL : software
Operating System : Linux 4.13.0-21-generic #24-Ubuntu SMP Mon Dec 18 17:29:16 UTC 2017 x86_64
Processor ID : x86 Family 6 Model 158 Stepping 9, GenuineIntel
Window System : No active display
Fault Count: 1
Abnormal termination:
Segmentation violation
Register State (from fault):
RAX = 00007f80b7df83d0 RBX = 0000000000000000
RCX = 0000000000000000 RDX = 00007f80b7df8600
RSP = 00007f80b7df82a0 RBP = 00007f80b7df8790
RSI = 00007f80bc3b6833 RDI = 00007f80b7df8390
R8 = 0000000000000000 R9 = 0000000000000000
R10 = 00000000000003ad R11 = 00007f80dbe55fe0
R12 = 00007f80b7df8390 R13 = 00007f80b7df8600
R14 = 00007f80b7df88a0 R15 = 00007f80dc10a7a0
RIP = 00007f80bc37c628 EFL = 0000000000010202
CS = 0033 FS = 0000 GS = 0000
Stack Trace (from fault):
[ 0] 0x00007f80bc37c628 bin/glnxa64/libmwddux_impl.so+00661032
[ 1] 0x00007f80bc356be9 bin/glnxa64/libmwddux_impl.so+00506857
[ 2] 0x00007f80bc3466b6 bin/glnxa64/libmwddux_impl.so+00439990
[ 3] 0x00007f80bc34c85a bin/glnxa64/libmwddux_impl.so+00464986
[ 4] 0x00007f80bc348370 bin/glnxa64/libmwddux_impl.so+00447344
[ 5] 0x00007f80d7cbe7d2 bin/glnxa64/libmwms.so+03168210
[ 6] 0x00007f80bc34808a bin/glnxa64/libmwddux_impl.so+00446602
[ 7] 0x00007f80da64bf65 /usr/local/MATLAB/R2017a/bin/glnxa64/libboost_thread.so.1.56.0+00073573
[ 8] 0x00007f80db6757fc /lib/x86_64-linux-gnu/libpthread.so.0+00030716
[ 9] 0x00007f80db3a2b0f /lib/x86_64-linux-gnu/libc.so.6+01133327 clone+00000063
[ 10] 0x0000000000000000 <unknown-module>+00000000
If this problem is reproducible, please submit a Service Request via:
http://www.mathworks.com/support/contact_us/
A technical support engineer might contact you with further information.
Thank you for your help.
** This crash report has been saved to disk as /root/matlab_crash_dump.3072-1 **
MATLAB is exiting because of fatal error
Traceback (most recent call last):
File "detect.py", line 178, in <module>
main(sys.argv)
File "detect.py", line 149, in main
detections = detector.detect_selective_search(inputs)
File "/home/USERNAME/deepLearning/hed/python/caffe/detector.py", line 120, in detect_selective_search
cmd='selective_search_rcnn'
File "/home/USERNAME/deepLearning/hed/python/selective_search_ijcv_with_python/selective_search.py", line 51, in get_windows
raise Exception("Matlab script did not exit successfully!")
Exception: Matlab script did not exit successfully!
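The final Python exception is not a separate bug: get_windows() in selective_search.py launches MATLAB as a subprocess and raises whenever the process exits with a nonzero status, which is exactly what happens after the segmentation violation. The pattern is roughly this (a sketch, with a harmless stand-in command instead of the MATLAB invocation):

```python
import subprocess
import sys

def run_script(cmd):
    """Run an external command; raise if it does not exit cleanly."""
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)
    output, _ = proc.communicate()
    if proc.returncode != 0:
        # Mirrors the check that produced the traceback above.
        raise Exception("Matlab script did not exit successfully!")
    return output

# Stand-in for the ['matlab', '-softwareopengl', ...] command shown above:
run_script([sys.executable, "-c", "print('ok')"])
```

So the exception is only the messenger; the thing to debug is why MATLAB itself crashes when launched from the script but not interactively (environment differences such as DISPLAY, the "No protocol specified" X11 message, and running as root are the usual suspects).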

Extract parts of chunked data in Python

I have data that looks like this:
INFO : Reading PDB list file 'model3.list'
INFO : Successfully read 10 / 10 PDBs from list file 'model3.list'
INFO : Successfully read 10 Chain structures
INFO : Processed 40 of 45 MAXSUBs
INFO : CPU time = 0.02 seconds
INFO : ======================================
INFO : 3D-Jury (Threshold: > 10 pairs # > 0.200)
INFO : ======================================
INFO : Rank Model Pairs File
INFO : 1 : 1 151 pdbs2/model.165.pdb
INFO : 2 : 7 145 pdbs2/model.150.pdb
INFO : 3 : 6 144 pdbs2/model.144.pdb
INFO : 4 : 9 142 pdbs2/model.125.pdb
INFO : 5 : 4 137 pdbs2/model.179.pdb
INFO : 6 : 8 137 pdbs2/model.191.pdb
INFO : 7 : 10 137 pdbs2/model.147.pdb
INFO : 8 : 3 135 pdbs2/model.119.pdb
INFO : 9 : 5 131 pdbs2/model.118.pdb
INFO : 10 : 2 129 pdbs2/model.128.pdb
INFO : ======================================
INFO : Pairwise single linkage clustering
INFO : ======================================
INFO : Hierarchical Tree
INFO : ======================================
INFO : Node Item 1 Item 2 Distance
INFO : 0 : 6 1 0.476 pdbs2/model.144.pdb pdbs2/model.165.pdb
INFO : -1 : 7 4 0.484 pdbs2/model.150.pdb pdbs2/model.179.pdb
INFO : -2 : 9 2 0.576 pdbs2/model.125.pdb pdbs2/model.128.pdb
INFO : -3 : -2 0 0.598
INFO : -4 : 10 -3 0.615 pdbs2/model.147.pdb
INFO : -5 : -1 -4 0.618
INFO : -6 : 8 -5 0.620 pdbs2/model.191.pdb
INFO : -7 : 3 -6 0.626 pdbs2/model.119.pdb
INFO : -8 : 5 -7 0.629 pdbs2/model.118.pdb
INFO : ======================================
INFO : 1 Clusters # Threshold 0.800 (0.8)
INFO : ======================================
INFO : Item Cluster
INFO : 1 : 1 pdbs2/model.165.pdb
INFO : 2 : 1 pdbs2/model.128.pdb
INFO : 3 : 1 pdbs2/model.119.pdb
INFO : 4 : 1 pdbs2/model.179.pdb
INFO : 5 : 1 pdbs2/model.118.pdb
INFO : 6 : 1 pdbs2/model.144.pdb
INFO : 7 : 1 pdbs2/model.150.pdb
INFO : 8 : 2 pdbs2/model.191.pdb
INFO : 9 : 2 pdbs2/model.125.pdb
INFO : 10 : 2 pdbs2/model.147.pdb
INFO : ======================================
INFO : Centroids
INFO : ======================================
INFO : Cluster Centroid Size Spread
INFO : 1 : 1 10 0.566 pdbs2/model.165.pdb
INFO : 2 : 10 3 0.777 pdbs2/model.147.pdb
INFO : ======================================
This is one chunk out of many in the file. Each chunk starts with the line
INFO : Reading PDB list file 'model3.list'
and the part I want to extract from each chunk is:
INFO : ======================================
INFO : Cluster Centroid Size Spread
INFO : 1 : 1 10 0.566 pdbs2/model.165.pdb
INFO : 2 : 10 3 0.777 pdbs2/model.147.pdb
INFO : ======================================
At the end of the day I want a dictionary that looks like this:
{1: "10 pdbs2/model.165.pdb",
 2: "3 pdbs2/model.147.pdb"}
That is, with the cluster number as the key, and the cluster size plus model file name as the value. What's the best way to achieve this in Python?
I'm stuck with this code:
import csv

def main():
    """Print lines containing the Centroids marker."""
    file_name = "max_cluster_output.txt"
    with open(file_name, 'r') as tsvfile:
        tabreader = csv.reader(tsvfile, delimiter=' ')
        for line in tabreader:
            if "Centroids" in line:
                print(line)
            #linelen = len(line)
            #if linelen >= 32 and linelen <= 34:
            #    print(linelen, line)

if __name__ == '__main__':
    main()
I would do this using regexes. I would have an outer loop that:
- reads lines until it finds "INFO : Reading PDB list file"
- reads lines until it finds "INFO : Cluster Centroid Size Spread"
and an inner loop that:
- creates dictionary entries from each subsequent line, until the line no longer matches
INFO : <number> : <number> <number> <number> <string>
It would look something like this (not tested):
import re

FILENAME = "foo.txt"

info = {}
with open(FILENAME) as f:
    while True:
        # Skip ahead to the start of the next chunk.
        for line in f:
            if re.match(r"^INFO\s+:\s+Reading PDB list file", line):
                break
        else:
            break  # ran out of file: no more chunks
        # Skip ahead to the header of the Centroids table.
        for line in f:
            if re.match(r"^INFO\s+:\s+Cluster\s+Centroid\s+Size\s+Spread", line):
                break
        # We're up to the data:
        # INFO : <cluster> : <centroid> <size> <spread> <file>
        for line in f:
            match = re.match(r"^INFO\s+:\s+(?P<Cluster>\d+)\s+:\s+\d+\s+"
                             r"(?P<Size>\d+)\s+\S+\s+(?P<FileName>\S+)\s*$", line)
            if match:
                info[int(match.group("Cluster"))] = "%s %s" % (
                    match.group("Size"), match.group("FileName"))
            else:
                break
print(info)
This code is here just to show the kinds of tools to use (looping, file iteration, breaking out early, regexes); it's by no means necessarily the most elegant way, and it is not tested.
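One quick way to sanity-check the row regex is to run it against the sample Centroids lines from the question, using io.StringIO in place of a real file:

```python
import io
import re

# The Centroids block from the question, verbatim apart from whitespace.
SAMPLE = """\
INFO  : ======================================
INFO  : Cluster  Centroid  Size  Spread
INFO  :     1 :     1      10    0.566  pdbs2/model.165.pdb
INFO  :     2 :    10       3    0.777  pdbs2/model.147.pdb
INFO  : ======================================
"""

# Groups: cluster number, size, file name; centroid and spread are skipped.
ROW = re.compile(r"^INFO\s+:\s+(\d+)\s+:\s+\d+\s+(\d+)\s+\S+\s+(\S+)\s*$")

info = {}
for line in io.StringIO(SAMPLE):
    m = ROW.match(line)
    if m:
        info[int(m.group(1))] = "%s %s" % (m.group(2), m.group(3))

print(info)  # {1: '10 pdbs2/model.165.pdb', 2: '3 pdbs2/model.147.pdb'}
```

The header and separator lines fail to match because the first capture group requires digits where "Cluster" and "=" appear, so only the data rows contribute entries.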
