R DTW multivariate series with asymmetric step fails to compute alignment - python

I'm using the DTW implementation found in R, along with its Python bindings, to verify the effects of changing different parameters (like the local constraint, the local distance function, and others) on my data. The data represents the feature vectors that an audio processing frontend outputs (MFCCs). Because of this I am dealing with multivariate time series; each feature vector has a size of 8. The problem I'm facing is that when I try to use certain local constraints (or step patterns) I get the following error:
Error in if (is.na(gcm$distance)) { : argument is of length zero

Traceback (most recent call last):
  File "r_dtw_simplified.py", line 32, in <module>
    alignment = R.dtw(canDist, rNull, "Euclidean", stepPattern, "none", True, False, True, False)
  File "D:\Python27\lib\site-packages\rpy2\robjects\functions.py", line 86, in __call__
    return super(SignatureTranslatedFunction, self).__call__(*args, **kwargs)
  File "D:\Python27\lib\site-packages\rpy2\robjects\functions.py", line 35, in __call__
    res = super(Function, self).__call__(*new_args, **new_kwargs)
rpy2.rinterface.RRuntimeError: Error in if (is.na(gcm$distance)) { : argument is of length zero
Because the process of generating and adapting the input data is complicated, I only made a simplified script to illustrate the error I'm receiving.
#data works
#reference = [[-0.126678, -1.541763, 0.29985, 1.719757, 0.755798, -3.594681, -1.492798, 3.493042], [-0.110596, -1.638184, 0.128174, 1.638947, 0.721085, -3.247696, -0.920013, 3.763977], [-0.022415, -1.643539, -0.130692, 1.441742, 1.022064, -2.882172, -0.952225, 3.662842], [0.071259, -2.030411, -0.531891, 0.835114, 1.320419, -2.432281, -0.469116, 3.871094], [0.070526, -2.056702, -0.688293, 0.530396, 1.962128, -1.681915, -0.368973, 4.542419], [0.047745, -2.005127, -0.798203, 0.616028, 2.146988, -1.895874, 0.371597, 4.090881], [0.013962, -2.162796, -1.008545, 0.363495, 2.062866, -0.856613, 0.543884, 4.043335], [0.066757, -2.152969, -1.087097, 0.257263, 2.592697, -0.422424, -0.280533, 3.327576], [0.123123, -2.061035, -1.012863, 0.389282, 2.50206, 0.078186, -0.887711, 2.828247], [0.157455, -2.060425, -0.790344, 0.210419, 2.542114, 0.016983, -0.959274, 1.916504], [0.029648, -2.128204, -1.047318, 0.116547, 2.44899, 0.166534, -0.677551, 2.49231], [0.158554, -1.821365, -1.045044, 0.374207, 2.426712, 0.406952, -1.055084, 2.543762], [0.077026, -1.863235, -1.14827, 0.277069, 2.669067, 0.362549, -1.294342, 1.66748], [0.101822, -1.800293, -1.126801, 0.364594, 2.503815, 0.294846, -0.881302, 1.281616], [0.166138, -1.627762, -0.866013, 0.494476, 2.450668, 0.569, -1.392868, 0.651184], [0.225006, -1.596069, -1.07634, 0.550049, 2.167435, 0.554123, -1.432983, 1.166931], [0.114777, -1.462769, -0.793167, 0.565704, 2.183792, 0.345978, -1.410919, 0.708679], [0.144028, -1.444458, -0.831985, 0.536652, 2.222366, 0.330368, -0.715149, 0.517212], [0.147888, -1.450577, -0.809372, 0.479584, 2.271378, 0.250763, -0.540359, -0.036072], [0.090714, -1.485474, -0.888153, 0.268768, 2.001221, 0.412537, -0.698868, 0.17157], [0.11972, -1.382767, -0.890457, 0.218414, 1.666519, 0.659592, -0.069641, 0.914307], [0.189774, -1.18428, -0.785797, 0.106659, 1.429977, 0.195236, 0.627029, 0.503296], [0.194702, -1.098068, -0.956818, 0.020386, 1.369247, 0.10437, 0.641724, 0.410767], [0.215134, -1.069092, -1.11644, 0.283234, 1.313507, 0.110962, 0.600861, 0.752869], [0.216766, -1.065338, -1.047974, 0.080231, 1.500702, -0.113388, 0.712646, 0.914307], [0.259933, -0.964386, -0.981369, 0.092224, 1.480667, -0.00238, 0.896255, 0.665344], [0.265991, -0.935257, -0.93779, 0.214966, 1.235275, 0.104782, 1.33754, 0.599487], [0.266098, -0.62619, -0.905792, 0.131409, 0.402908, 0.103363, 1.352814, 1.554688], [0.273468, -0.354691, -0.709579, 0.228027, 0.315125, -0.15564, 0.942123, 1.024292], [0.246429, -0.272522, -0.609924, 0.318604, -0.007355, -0.165756, 1.07019, 1.087708], [0.248596, -0.232468, -0.524887, 0.53009, -0.476334, -0.184479, 1.088089, 0.667358], [0.074478, -0.200455, -0.058411, 0.662811, -0.111923, -0.686462, 1.205154, 1.271912], [0.063065, -0.080765, 0.065552, 0.79071, -0.569946, -0.899506, 0.875687, 0.095215], [0.117706, -0.270584, -0.021027, 0.723694, -0.200073, -0.365158, 0.892624, -0.152466], [0.00148, -0.075348, 0.017761, 0.757507, 0.719299, -0.355362, 0.749329, 0.315247], [0.035034, -0.110794, 0.038559, 0.949677, 0.478699, 0.005951, 0.097305, -0.388245], [-0.101944, -0.392487, 0.401886, 1.154938, 0.199127, 0.117371, -0.070007, -0.562439], [-0.083282, -0.388657, 0.449066, 1.505951, 0.46405, -0.566208, 0.216293, -0.528076], [-0.152054, -0.100113, 0.833054, 1.746857, 0.085861, -1.314102, 0.294632, -0.470947], [-0.166672, -0.183777, 0.988373, 1.925262, -0.202057, -0.961441, 0.15242, 0.594421], [-0.234573, -0.227707, 1.102112, 1.802002, -0.382492, -1.153336, 0.29335, 0.074036], [-0.336426, 0.042435, 1.255096, 1.804535, -0.610153, -0.810745, 
1.308441, 0.599854], [-0.359344, 0.007248, 1.344543, 1.441559, -0.758286, -0.800079, 1.0233, 0.668213], [-0.321823, 0.027618, 1.1521, 1.509827, -0.708267, -0.668152, 1.05722, 0.710571], [-0.265335, 0.012344, 1.491501, 1.844971, -0.584137, -1.042419, -0.449188, 0.5354], [-0.302399, 0.049698, 1.440643, 1.674866, -0.626633, -1.158554, -0.906937, 0.405579], [-0.330276, 0.466675, 1.444153, 0.855499, -0.645447, -0.352158, 0.730423, 0.429932], [-0.354721, 0.540207, 1.570786, 0.626648, -0.897446, -0.007416, 0.174042, 0.100525], [-0.239609, 0.669983, 0.978851, 0.85321, -0.156784, 0.107986, 0.915054, 0.114197], [-0.189346, 0.930756, 0.824295, 0.516083, -0.339767, -0.206314, 0.744049, -0.36377]]
#query = [[0.387268, -1.21701, -0.432266, -1.394104, -0.458984, -1.469788, 0.12764, 2.310059], [0.418091, -1.389526, -0.150146, -0.759155, -0.578003, -2.123199, 0.276001, 3.022339], [0.264694, -1.526886, -0.238907, -0.511108, -0.90683, -2.699249, 0.692032, 2.849854], [0.246628, -1.675171, -0.533432, 0.070007, -0.392151, -1.739227, 0.534485, 2.744019], [0.099335, -1.983826, -0.985291, 0.428833, 0.26535, -1.285583, -0.234451, 2.4729], [0.055893, -2.108063, -0.401825, 0.860413, 0.724106, -1.959137, -1.360458, 2.350708], [-0.131592, -1.928314, -0.056213, 0.577698, 0.859146, -1.812286, -1.21669, 2.2052], [-0.162796, -2.149933, 0.467239, 0.524231, 0.74913, -1.829498, -0.741913, 1.616577], [-0.282745, -1.971008, 0.837616, 0.56427, 0.198288, -1.826935, -0.118027, 1.599731], [-0.497223, -1.578705, 1.277298, 0.682983, 0.055084, -2.032562, 0.64151, 1.719238], [-0.634232, -1.433258, 1.760513, 0.550415, -0.053787, -2.188568, 1.666687, 1.611938], [-0.607498, -1.302826, 1.960556, 1.331726, 0.417633, -2.271973, 2.095001, 0.9823], [-0.952957, -0.222076, 0.772064, 2.062256, -0.295258, -1.255371, 3.450974, -0.047607], [-1.210587, 1.00061, 0.036392, 1.952209, 0.470123, 0.231628, 2.670502, -0.608276], [-1.213287, 0.927002, -0.414825, 2.104065, 1.160126, 0.088898, 1.32959, -0.018311], [-1.081558, 1.007751, -0.337509, 1.7146, 0.653687, 0.297089, 1.916733, -0.772461], [-1.064804, 1.284302, -0.393585, 2.150635, 0.132294, 0.443298, 1.967575, 0.775513], [-0.972366, 1.039734, -0.588135, 1.413818, 0.423813, 0.781494, 1.977509, -0.556274], [-0.556381, 0.591309, -0.678314, 1.025635, 1.094284, 2.234711, 1.504013, -1.71875], [-0.063477, 0.626129, 0.360489, 0.149902, 0.92804, 0.936493, 1.203018, 0.264282], [0.162003, 0.577698, 0.956863, -0.477051, 1.081161, 0.817749, 0.660843, -0.428711], [-0.049515, 0.423615, 0.82489, 0.446228, 1.323853, 0.562775, -0.144196, 1.145386], [-0.146851, 0.171906, 0.304871, 0.320435, 1.378937, 0.673004, 0.188416, 0.208618], [0.33992, -2.072418, -0.447968, 0.526794, -0.175858, -1.400299, -0.452454, 1.396606], [0.226089, -2.183441, -0.301071, -0.475159, 0.834961, -2.191864, -1.092361, 2.434814], [0.279556, -2.073181, -0.517639, -0.766479, 0.974808, -2.070374, -2.003891, 2.706421], [0.237961, -1.9245, -0.708435, -0.582153, 1.285934, -1.75882, -2.146164, 2.369995], [0.149658, -1.703705, -0.539749, -0.215332, 1.369705, -1.484802, -1.506256, 1.04126], [0.078735, -1.719543, 0.157013, 0.382385, 1.100998, -0.223755, 0.021683, -0.545654], [0.106003, -1.404358, 0.372345, 1.881165, -0.292511, -0.263855, 1.579529, -1.426025], [0.047729, -1.198608, 0.600769, 1.901123, -1.106949, 0.128815, 1.293701, -1.364258], [0.110748, -0.894348, 0.712601, 1.728699, -1.250381, 0.674377, 0.812302, -1.428833], [0.085754, -0.662903, 0.794312, 1.102844, -1.234283, 1.084442, 0.986938, -1.10022], [0.140823, -0.300323, 0.673508, 0.669983, -0.551453, 1.213074, 1.449326, -1.567261], [0.03743, 0.550293, 0.400909, -0.174622, 0.355301, 1.325867, 0.875854, 0.126953], [-0.084885, 1.128906, 0.292099, -0.248779, 0.722961, 0.873871, -0.409515, 0.470581], [0.019684, 0.947754, 0.19931, -0.306274, 0.176849, 1.431702, 1.091507, 0.701416], [-0.094162, 0.895203, 0.687378, -0.229065, 0.549088, 1.376953, 0.892303, -0.642334], [-0.727692, 0.626495, 0.848877, 0.521362, 1.521912, -0.443481, 1.247238, 0.197388], [-0.82048, 0.117279, 0.975174, 1.487244, 1.085281, -0.567993, 0.776093, -0.381592], [-0.009827, -0.553009, -0.213135, 0.837341, 0.482712, -0.939423, 0.140884, 0.330566], [-0.018127, -1.362335, -0.199265, 1.260742, 0.005188, -1.445068, 
-1.159653, 1.220825], [0.186172, -1.727814, -0.246552, 1.544128, 0.285416, 0.081848, -1.634003, -0.47522], [0.193649, -1.144043, -0.334854, 1.220276, 1.241302, 1.554382, 0.57048, -1.334961], [0.344604, -1.647461, -0.720749, 0.993774, 0.585709, 0.953522, -0.493042, -1.845703], [0.37471, -1.989471, -0.518555, 0.555908, -0.025787, 0.148132, -1.463425, -0.844849], [0.34523, -1.821625, -0.809418, 0.59137, -0.577927, 0.037903, -2.067764, -0.519531], [0.413193, -1.503876, -0.752243, 0.280396, -0.236206, 0.429932, -1.684097, -0.724731], [0.331299, -1.349243, -0.890121, -0.178589, -0.285721, 0.809875, -2.012329, -0.157227], [0.278946, -1.090057, -0.670441, -0.477539, -0.267105, 0.446045, -1.95668, 0.501343], [0.127304, -0.977112, -0.660324, -1.011658, -0.547409, 0.349182, -1.357574, 1.045654], [0.217728, -0.793182, -0.496262, -1.259949, -0.128937, 0.38855, -1.513306, 1.863647], [0.240143, -0.891541, -0.619995, -1.478577, -0.361481, 0.258362, -1.630585, 1.841064], [0.241547, -0.758453, -0.515442, -1.370605, -0.428238, 0.23996, -1.469406, 1.307617], [0.289948, -0.714661, -0.533798, -1.574036, 0.017929, -0.368317, -1.290283, 0.851563], [0.304916, -0.783752, -0.459915, -1.523621, -0.107651, -0.027649, -1.089905, 0.969238], [0.27179, -0.795593, -0.352432, -1.597656, -0.001678, -0.06189, -1.072495, 0.637329], [0.301956, -0.823578, -0.152115, -1.637634, 0.2034, -0.214508, -1.315491, 0.773071], [0.282486, -0.853271, -0.162094, -1.561096, 0.15686, -0.289307, -1.076874, 0.673706], [0.299881, -0.97052, -0.051086, -1.431152, -0.074692, -0.32428, -1.385452, 0.684326], [0.220886, -1.072266, -0.269531, -1.038269, 0.140533, -0.711273, -1.7453, 1.090332], [0.177628, -1.229126, -0.274292, -0.943481, 0.483246, -1.214447, -2.026321, 0.719971], [0.176987, -1.137543, -0.007645, -0.794861, 0.965118, -1.084717, -2.37677, 0.598267], [0.135727, -1.36795, 0.09462, -0.776367, 0.946655, -1.157959, -2.794403, 0.226074], [0.067337, -1.648987, 0.535721, -0.665833, 1.506119, -1.348755, -3.092728, 0.281616], [-0.038101, -1.437347, 0.983917, -0.280762, 1.880722, -1.351318, -3.002258, -0.599609], [-0.152573, -1.146027, 0.717545, -0.60321, 2.126541, -0.59198, -2.282028, -1.048584], [-0.113525, -0.629669, 0.925323, 0.465393, 2.368698, -0.352661, -1.969391, -0.915161], [-0.140121, -0.311951, 0.884262, 0.809021, 1.557693, -0.552429, -1.776062, -0.925537], [-0.189423, -0.117767, 0.975174, 1.595032, 1.284485, -0.698639, -2.007202, -1.307251], [-0.048874, -0.176941, 0.820679, 1.306519, 0.584259, -0.913147, -0.658066, -0.630981], [-0.127594, 0.33313, 0.791336, 1.400696, 0.685577, -1.500275, -0.657959, -0.207642], [-0.044128, 0.653351, 0.615326, 0.476685, 1.099625, -0.902893, -0.154449, 0.325073], [-0.150223, 1.059845, 1.208405, -0.038635, 0.758667, 0.458038, -0.178909, -0.998657], [-0.099854, 1.127197, 0.789871, -0.013611, 0.452805, 0.736176, 0.948273, -0.236328], [-0.250275, 1.188568, 0.935989, 0.34314, 0.130463, 0.879913, 1.669037, 0.12793], [-0.122818, 1.441223, 0.670029, 0.389526, -0.15274, 1.293549, 1.22908, -1.132568]]
#this one doesn't
reference = [[-0.453598, -2.439209, 0.973587, 1.362091, -0.073654, -1.755112, 1.090057, 4.246765], [-0.448502, -2.621201, 0.723282, 1.257324, 0.26619, -1.375351, 1.328735, 4.46991], [-0.481247, -2.29718, 0.612854, 1.078033, 0.309708, -2.037506, 1.056305, 3.181702], [-0.42482, -2.306702, 0.436157, 1.529907, 0.50708, -1.930069, 0.653198, 3.561768], [-0.39032, -2.361343, 0.589294, 1.965607, 0.611801, -2.417084, 0.035675, 3.381104], [-0.233444, -2.281525, 0.703171, 2.17868, 0.519257, -2.474442, -0.502808, 3.569153], [-0.174652, -1.924591, 0.180267, 2.127075, 0.250626, -2.208527, -0.396591, 2.565552], [-0.121078, -1.53801, 0.234344, 2.221039, 0.845367, -1.516205, -0.174149, 1.298645], [-0.18631, -1.047806, 0.629654, 2.073303, 0.775024, -1.931076, 0.382706, 2.278442], [-0.160477, -0.78743, 0.694214, 1.917572, 0.834885, -1.574707, 0.780045, 2.370422], [-0.203659, -0.427246, 0.726486, 1.548767, 0.465698, -1.185379, 0.555206, 2.619629], [-0.208298, -0.393707, 0.771881, 1.646484, 0.612946, -0.996277, 0.658539, 2.499146], [-0.180679, -0.166656, 0.689209, 1.205994, 0.3918, -1.051483, 0.771072, 1.854553], [-0.1978, 0.082764, 0.723541, 1.019104, 0.165405, -0.127533, 1.0522, 0.552368], [-0.171127, 0.168533, 0.529541, 0.584839, 0.702011, -0.36525, 0.711792, 1.029114], [-0.224243, 0.38765, 0.916031, 0.45108, 0.708923, -0.059326, 1.016312, 0.437561], [-0.217072, -0.981766, 1.67363, 1.864014, 0.050812, -2.572815, -0.22937, 0.757996], [-0.284714, -0.784927, 1.720383, 1.782379, -0.093414, -2.492111, 0.623398, 0.629028], [-0.261169, -0.427979, 1.680038, 1.585358, 0.067093, -1.8181, 1.276291, 0.838989], [-0.183075, -0.08197, 1.094147, 1.120392, -0.117752, -0.86142, 1.94194, 0.966858], [-0.188919, 0.121521, 1.277664, 0.90979, 0.114288, -0.880875, 1.920517, 0.95752], [-0.226868, 0.338455, 0.78067, 0.803009, 0.347092, -0.387955, 0.641296, 0.374634], [-0.206329, 0.768158, 0.759537, 0.264099, 0.15979, 0.152618, 0.911636, -0.011597], [-0.230453, 0.495941, 0.547165, 0.137604, 0.36377, 0.594406, 1.168839, 0.125916], [0.340851, -0.382736, -1.060455, -0.267792, 1.1306, 0.595047, -1.544922, -1.6828], [0.341492, -0.325836, -1.07164, -0.215607, 0.895645, 0.400177, -0.773956, -1.827515], [0.392075, -0.305389, -0.885422, -0.293427, 0.993225, 0.66655, -1.061218, -1.730713], [0.30191, -0.339005, -0.877853, 0.153992, 0.986588, 0.711823, -1.100525, -1.648376], [0.303574, -0.491241, -1.000183, 0.075378, 0.686295, 0.752792, -1.192123, -1.744568], [0.315781, -0.629456, -0.996063, 0.224731, 1.074173, 0.757736, -1.170807, -2.08313], [0.313675, -0.804688, -1.00325, 0.431641, 0.685883, 0.538879, -0.988373, -2.421326], [0.267181, -0.790329, -0.726974, 0.853027, 1.369629, -0.213638, -1.708023, -1.977844], [0.304459, -0.935257, -0.778061, 1.042633, 1.391861, -0.296768, -1.562164, -2.014099], [0.169754, -0.792953, -0.481842, 1.404236, 0.766983, -0.29805, -1.587265, -1.25531], [0.15918, -0.9814, -0.197662, 1.748718, 0.888367, -0.880234, -1.64949, -1.359802], [0.028244, -0.772934, -0.186172, 1.594238, 0.863571, -1.224701, -1.153183, -0.292664], [-0.020401, -0.461578, 0.368088, 1.000366, 1.079636, -0.389603, -0.144409, 0.651733], [0.018555, -0.725418, 0.632599, 1.707336, 0.535049, -1.783859, -0.916122, 1.557007], [-0.038971, -0.797668, 0.820419, 1.483093, 0.350494, -1.465073, -0.786453, 1.370361], [-0.244888, -0.469513, 1.067978, 1.028809, 0.4879, -1.796585, -0.77887, 1.888977], [-0.260193, -0.226593, 1.141754, 1.21228, 0.214005, -1.200943, -0.441177, 0.532715], [-0.165283, 0.016129, 1.263016, 0.745514, -0.211288, -0.802368, 0.215698, 
0.316406], [-0.353134, 0.053787, 1.544189, 0.21106, -0.469086, -0.485367, 0.767761, 0.849548], [-0.330215, 0.162704, 1.570053, 0.304718, -0.561172, -0.410294, 0.895126, 0.858093], [-0.333847, 0.173904, 1.56958, 0.075531, -0.5569, -0.259552, 1.276764, 0.749084], [-0.347107, 0.206665, 1.389832, 0.50473, -0.721664, -0.56955, 1.542618, 0.817444], [-0.299057, 0.140244, 1.402924, 0.215363, -0.62767, -0.550461, 1.60788, 0.506958], [-0.292084, 0.052063, 1.463348, 0.290497, -0.462875, -0.497452, 1.280609, 0.261841], [-0.279877, 0.183548, 1.308609, 0.305756, -0.6483, -0.374771, 1.647781, 0.161865], [-0.28389, 0.27916, 1.148636, 0.466736, -0.724442, -0.21991, 1.819901, -0.218872], [-0.275528, 0.309753, 1.192856, 0.398163, -0.828781, -0.268066, 1.763672, 0.116089], [-0.275284, 0.160019, 1.200623, 0.718628, -0.925552, -0.026596, 1.367447, 0.174866], [-0.302795, 0.383438, 1.10556, 0.441833, -0.968323, -0.137375, 1.851791, 0.357971], [-0.317078, 0.22876, 1.272217, 0.462219, -0.855789, -0.294296, 1.593994, 0.127502], [-0.304932, 0.207718, 1.156189, 0.481506, -0.866776, -0.340027, 1.670105, 0.657837], [-0.257217, 0.155655, 1.041428, 0.717926, -0.761597, -0.17244, 1.114151, 0.653503], [-0.321426, 0.292358, 0.73848, 0.422607, -0.850754, -0.057907, 1.462357, 0.697754], [-0.34642, 0.361526, 0.69722, 0.585175, -0.464508, -0.26651, 1.860596, 0.106201], [-0.339844, 0.584229, 0.542603, 0.184937, -0.341263, 0.085648, 1.837311, 0.160461], [-0.32338, 0.661224, 0.512833, 0.319702, -0.195572, 0.004028, 1.046799, 0.233704], [-0.346329, 0.572388, 0.385986, 0.118988, 0.057556, 0.039001, 1.255081, -0.18573], [-0.383392, 0.558395, 0.553391, -0.358612, 0.443573, -0.086014, 0.652878, 0.829956], [-0.420395, 0.668991, 0.64856, -0.021271, 0.511475, 0.639221, 0.860474, 0.463196], [-0.359039, 0.748672, 0.522964, -0.308899, 0.717194, 0.218811, 0.681396, 0.606812], [-0.323914, 0.942627, 0.249069, -0.418365, 0.673599, 0.797974, 0.162674, 0.120361], [-0.411301, 0.92775, 0.493332, -0.286346, 0.165054, 0.63446, 1.085571, 0.120789], [-0.346191, 0.632309, 0.635056, -0.402496, 0.143814, 0.785614, 0.952164, 0.482727], [-0.203812, 0.789261, 0.240433, -0.47699, -0.12912, 0.91832, 1.145493, 0.052002], [-0.048203, 0.632095, 0.009583, -0.53833, 0.232727, 1.293045, 0.308151, 0.188904], [-0.062393, 0.732315, 0.06694, -0.697144, 0.126221, 0.864578, 0.581635, -0.088379]]
query = [[-0.113144, -3.316223, -1.101563, -2.128418, 1.853867, 3.61972, 1.218185, 1.71228], [-0.128952, -3.37915, -1.152237, -2.033081, 1.860199, 4.008179, 0.445938, 1.665894], [-0.0392, -2.976654, -0.888245, -1.613953, 1.638641, 3.849518, 0.034073, 0.768188], [-0.146042, -2.980713, -1.044113, -1.44397, 0.954514, 3.20929, -0.232422, 1.050781], [-0.155029, -2.997192, -1.064438, -1.369873, 0.67688, 2.570709, -0.855347, 1.523438], [-0.102341, -2.686401, -1.029648, -1.00531, 0.950089, 1.933228, -0.526367, 1.598633], [-0.060272, -2.538727, -1.278259, -0.65332, 0.630875, 1.459717, -0.264038, 1.872925], [0.064087, -2.592682, -1.112823, -0.775024, 0.848618, 0.810883, 0.298965, 2.312134], [0.111557, -2.815277, -1.203506, -1.173584, 0.54863, 0.46756, -0.023071, 3.029053], [0.266068, -2.624786, -1.089066, -0.864136, 0.055389, 0.619446, -0.160965, 2.928589], [0.181488, -2.31073, -1.307785, -0.720276, 0.001297, 0.534668, 0.495499, 2.989502], [0.216202, -2.25354, -1.288193, -0.902039, -0.152283, -0.060791, 0.566315, 2.911621], [0.430084, -2.0289, -1.099594, -1.091736, -0.302505, -0.087799, 0.955963, 2.677002], [0.484253, -1.412842, -0.881882, -1.087158, -1.064072, -0.145935, 1.437683, 2.606567], [0.339081, -1.277222, -1.24498, -1.048279, -0.219498, 0.448517, 1.168625, 0.563843], [0.105728, 0.138275, -1.01413, -0.489868, 1.319275, 1.604645, 1.634003, -0.94812], [-0.209061, 1.025665, 0.180405, 0.955566, 1.527405, 0.91745, 1.951233, -0.40686], [-0.136993, 1.332275, 0.639862, 1.277832, 1.277313, 0.361267, 0.390717, -0.728394], [-0.217758, 1.416718, 1.080002, 0.816101, 0.343933, -0.154175, 1.10347, -0.568848]]
import numpy as np
import rpy2.robjects
import rpy2.robjects.numpy2ri
from rpy2.robjects.packages import importr

reference = np.array(reference)
query = np.array(query)
rpy2.robjects.numpy2ri.activate()
# Set up our R namespaces
R = rpy2.robjects.r
rNull = R("NULL")
rprint = rpy2.robjects.globalenv.get("print")
rplot = rpy2.robjects.r('plot')
distConstr = rpy2.robjects.r('proxy::dist')
DTW = importr('dtw')
stepName = "asymmetricP05"
stepPattern = rpy2.robjects.r(stepName)
# cross-distance matrix between all reference and query frames
canDist = distConstr(reference, query, "Euclidean")
alignment = R.dtw(canDist, rNull, "Euclidean", stepPattern, "none", True, False, True, False)
For some series the script doesn't generate the error, but there are some which do; see the commented lines for examples. It is worth noting that with the classic constraint this error does not appear. I am thinking that perhaps I have not set something up correctly, but I am no expert in Python nor in R, which is why I was hoping that others who have used the R DTW package can help me with this. I am sorry for the long lines of reference and query data (the data comes from the MFCC output of a 2-second wav file).

One of the two series is too short to be compatible with the fancy step pattern you chose. Try the common symmetric2 pattern, which does not restrict slopes, before moving on to the more exotic ones.
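A minimal sketch of that check, assuming the rpy2 setup from the question (the [1/3, 3] slope window is an assumption based on the Sakoe-Chiba P = 0.5 constraint that asymmetricP05 implements): the failing query has 19 frames against roughly 70 reference frames, a ratio below 1/3, so no complete warping path exists and gcm$distance comes back empty.

# assumption: asymmetricP05 (Sakoe-Chiba P = 0.5) only admits warping paths
# whose overall slope stays within [1/3, 3]
n, m = len(query), len(reference)
if not (1.0 / 3.0 <= float(n) / m <= 3.0):
    # no feasible path for asymmetricP05; fall back to a pattern with
    # unconstrained slope, such as symmetric2
    stepPattern = rpy2.robjects.r("symmetric2")
alignment = R.dtw(canDist, rNull, "Euclidean", stepPattern, "none", True, False, True, False)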


TypeError when fitting Statsmodels OLS with standard errors clustered 2 ways

Context
Building on top of How to run Panel OLS regressions with 3+ fixed-effect and errors clustering?, and notably Josef's third comment there, I am trying to adapt the "OLS Coefficients and Standard Errors Clustered by Firm and Year" section of this example notebook, shown below:
cluster_2ways_ols = sm.ols(formula='y ~ x', data=df).fit(cov_type='cluster',
                                                         cov_kwds={'groups': np.array(df[['firmid', 'year']])},
                                                         use_t=True)
to my own example dataset.
Note that I am able to reproduce this example (and it works). I can also add fixed effects by using 'y ~ x + C(firmid) + C(year)' as the formula instead, as in the sketch below.
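For reference, here is a sketch of that fixed-effects variant (hypothetical: it assumes the same df, firmid, and year columns, and the same ols import, as the notebook snippet quoted above):

# same two-way clustering as before, now with firm and year fixed effects
fe_ols = sm.ols(formula='y ~ x + C(firmid) + C(year)', data=df).fit(
    cov_type='cluster',
    cov_kwds={'groups': np.array(df[['firmid', 'year']])},
    use_t=True,
)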
Problem
However, trying to port the same command to my example dataset (see code below), I'm getting the following error:
>>> model = sm.OLS.from_formula("gdp ~ population + C(year_publication) + C(country)", df)
>>> result = model.fit(
...     cov_type='cluster',
...     cov_kwds={'groups': np.array(df[['country', 'year_publication']])},
...     use_t=True
... )
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/path/venv/lib64/python3.10/site-packages/statsmodels/regression/linear_model.py", line 343, in fit
    lfit = OLSResults(
  File "/path/venv/lib64/python3.10/site-packages/statsmodels/regression/linear_model.py", line 1607, in __init__
    self.get_robustcov_results(cov_type=cov_type, use_self=True,
  File "/path/venv/lib64/python3.10/site-packages/statsmodels/regression/linear_model.py", line 2568, in get_robustcov_results
    res.cov_params_default = sw.cov_cluster_2groups(
  File "/path/venv/lib64/python3.10/site-packages/statsmodels/stats/sandwich_covariance.py", line 591, in cov_cluster_2groups
    combine_indices(group)[0],
  File "/path/venv/lib64/python3.10/site-packages/statsmodels/tools/grouputils.py", line 55, in combine_indices
    groups_ = groups.view([('', groups.dtype)] * groups.shape[1])
  File "/path/venv/lib64/python3.10/site-packages/numpy/core/_internal.py", line 549, in _view_is_safe
    raise TypeError("Cannot change data-type for object array.")
TypeError: Cannot change data-type for object array.
I have tried to manually cast the year_publication to string/object using np.array(df[['country', 'year_publication']].astype("str")), but it doesn't solve the issue.
Questions
What is the cause of the TypeError()?
How to adapt the example command to my dataset?
Minimal Working Example
from io import StringIO
import numpy as np
import pandas as pd
import statsmodels.api as sm
DATA = """
"continent","country","source","year_publication","year_data","population","gdp"
"Africa","Angola","OECD",2020,2018,972,52.69
"Africa","Angola","OECD",2020,2019,986,802.7
"Africa","Angola","OECD",2020,2020,641,568.74
"Africa","Angola","OECD",2021,2018,438,168.83
"Africa","Angola","OECD",2021,2019,958,310.57
"Africa","Angola","OECD",2021,2020,270,144.02
"Africa","Angola","OECD",2022,2018,528,359.71
"Africa","Angola","OECD",2022,2019,974,582.98
"Africa","Angola","OECD",2022,2020,835,820.49
"Africa","Angola","IMF",2020,2018,168,148.85
"Africa","Angola","IMF",2020,2019,460,236.21
"Africa","Angola","IMF",2020,2020,360,297.15
"Africa","Angola","IMF",2021,2018,381,249.13
"Africa","Angola","IMF",2021,2019,648,128.05
"Africa","Angola","IMF",2021,2020,206,179.05
"Africa","Angola","IMF",2022,2018,282,150.29
"Africa","Angola","IMF",2022,2019,125,23.42
"Africa","Angola","IMF",2022,2020,410,247.35
"Africa","Angola","WorldBank",2020,2018,553,182.06
"Africa","Angola","WorldBank",2020,2019,847,698.87
"Africa","Angola","WorldBank",2020,2020,844,126.61
"Africa","Angola","WorldBank",2021,2018,307,239.76
"Africa","Angola","WorldBank",2021,2019,659,510.73
"Africa","Angola","WorldBank",2021,2020,548,331.89
"Africa","Angola","WorldBank",2022,2018,448,122.76
"Africa","Angola","WorldBank",2022,2019,768,761.41
"Africa","Angola","WorldBank",2022,2020,324,163.57
"Africa","Benin","OECD",2020,2018,513,196.9
"Africa","Benin","OECD",2020,2019,590,83.7
"Africa","Benin","OECD",2020,2020,791,511.09
"Africa","Benin","OECD",2021,2018,799,474.43
"Africa","Benin","OECD",2021,2019,455,234.21
"Africa","Benin","OECD",2021,2020,549,238.83
"Africa","Benin","OECD",2022,2018,235,229.33
"Africa","Benin","OECD",2022,2019,347,46.51
"Africa","Benin","OECD",2022,2020,532,392.13
"Africa","Benin","IMF",2020,2018,138,137.05
"Africa","Benin","IMF",2020,2019,978,239.82
"Africa","Benin","IMF",2020,2020,821,33.41
"Africa","Benin","IMF",2021,2018,453,291.93
"Africa","Benin","IMF",2021,2019,526,381.88
"Africa","Benin","IMF",2021,2020,467,313.57
"Africa","Benin","IMF",2022,2018,948,555.23
"Africa","Benin","IMF",2022,2019,323,289.91
"Africa","Benin","IMF",2022,2020,421,62.35
"Africa","Benin","WorldBank",2020,2018,983,271.69
"Africa","Benin","WorldBank",2020,2019,138,23.55
"Africa","Benin","WorldBank",2020,2020,636,623.65
"Africa","Benin","WorldBank",2021,2018,653,534.99
"Africa","Benin","WorldBank",2021,2019,564,368.8
"Africa","Benin","WorldBank",2021,2020,741,312.02
"Africa","Benin","WorldBank",2022,2018,328,292.11
"Africa","Benin","WorldBank",2022,2019,653,429.21
"Africa","Benin","WorldBank",2022,2020,951,242.73
"Africa","Chad","OECD",2020,2018,176,95.06
"Africa","Chad","OECD",2020,2019,783,425.34
"Africa","Chad","OECD",2020,2020,885,461.6
"Africa","Chad","OECD",2021,2018,673,15.87
"Africa","Chad","OECD",2021,2019,131,74.46
"Africa","Chad","OECD",2021,2020,430,61.58
"Africa","Chad","OECD",2022,2018,593,211.34
"Africa","Chad","OECD",2022,2019,647,550.37
"Africa","Chad","OECD",2022,2020,154,105.65
"Africa","Chad","IMF",2020,2018,160,32.41
"Africa","Chad","IMF",2020,2019,654,27.84
"Africa","Chad","IMF",2020,2020,616,468.92
"Africa","Chad","IMF",2021,2018,996,22.4
"Africa","Chad","IMF",2021,2019,126,93.18
"Africa","Chad","IMF",2021,2020,879,547.87
"Africa","Chad","IMF",2022,2018,663,520
"Africa","Chad","IMF",2022,2019,681,544.76
"Africa","Chad","IMF",2022,2020,101,55.6
"Africa","Chad","WorldBank",2020,2018,786,757.22
"Africa","Chad","WorldBank",2020,2019,599,593.69
"Africa","Chad","WorldBank",2020,2020,641,529.84
"Africa","Chad","WorldBank",2021,2018,343,287.89
"Africa","Chad","WorldBank",2021,2019,438,340.83
"Africa","Chad","WorldBank",2021,2020,762,594.67
"Africa","Chad","WorldBank",2022,2018,430,128.69
"Africa","Chad","WorldBank",2022,2019,260,242.59
"Africa","Chad","WorldBank",2022,2020,607,216.1
"Europe","Denmark","OECD",2020,2018,114,86.75
"Europe","Denmark","OECD",2020,2019,937,373.29
"Europe","Denmark","OECD",2020,2020,866,392.93
"Europe","Denmark","OECD",2021,2018,296,41.04
"Europe","Denmark","OECD",2021,2019,402,32.67
"Europe","Denmark","OECD",2021,2020,306,7.88
"Europe","Denmark","OECD",2022,2018,540,379.51
"Europe","Denmark","OECD",2022,2019,108,26.72
"Europe","Denmark","OECD",2022,2020,752,307.2
"Europe","Denmark","IMF",2020,2018,157,24.24
"Europe","Denmark","IMF",2020,2019,303,79.04
"Europe","Denmark","IMF",2020,2020,286,122.36
"Europe","Denmark","IMF",2021,2018,569,69.32
"Europe","Denmark","IMF",2021,2019,808,642.67
"Europe","Denmark","IMF",2021,2020,157,5.58
"Europe","Denmark","IMF",2022,2018,147,112.21
"Europe","Denmark","IMF",2022,2019,414,311.16
"Europe","Denmark","IMF",2022,2020,774,230.46
"Europe","Denmark","WorldBank",2020,2018,695,350.03
"Europe","Denmark","WorldBank",2020,2019,511,209.84
"Europe","Denmark","WorldBank",2020,2020,181,29.27
"Europe","Denmark","WorldBank",2021,2018,503,176.89
"Europe","Denmark","WorldBank",2021,2019,710,609.02
"Europe","Denmark","WorldBank",2021,2020,264,165.78
"Europe","Denmark","WorldBank",2022,2018,670,638.99
"Europe","Denmark","WorldBank",2022,2019,651,354.6
"Europe","Denmark","WorldBank",2022,2020,632,623.94
"Europe","Estonia","OECD",2020,2018,838,263.67
"Europe","Estonia","OECD",2020,2019,638,533.95
"Europe","Estonia","OECD",2020,2020,898,638.73
"Europe","Estonia","OECD",2021,2018,262,98.16
"Europe","Estonia","OECD",2021,2019,569,552.54
"Europe","Estonia","OECD",2021,2020,868,252.48
"Europe","Estonia","OECD",2022,2018,927,264.65
"Europe","Estonia","OECD",2022,2019,205,150.6
"Europe","Estonia","OECD",2022,2020,828,752.61
"Europe","Estonia","IMF",2020,2018,841,176.31
"Europe","Estonia","IMF",2020,2019,614,230.55
"Europe","Estonia","IMF",2020,2020,500,41.19
"Europe","Estonia","IMF",2021,2018,510,169.68
"Europe","Estonia","IMF",2021,2019,765,401.85
"Europe","Estonia","IMF",2021,2020,751,319.6
"Europe","Estonia","IMF",2022,2018,314,58.81
"Europe","Estonia","IMF",2022,2019,155,2.24
"Europe","Estonia","IMF",2022,2020,734,187.6
"Europe","Estonia","WorldBank",2020,2018,332,160.17
"Europe","Estonia","WorldBank",2020,2019,466,385.33
"Europe","Estonia","WorldBank",2020,2020,487,435.06
"Europe","Estonia","WorldBank",2021,2018,461,249.19
"Europe","Estonia","WorldBank",2021,2019,932,763.38
"Europe","Estonia","WorldBank",2021,2020,650,463.91
"Europe","Estonia","WorldBank",2022,2018,570,549.97
"Europe","Estonia","WorldBank",2022,2019,909,80.48
"Europe","Estonia","WorldBank",2022,2020,523,242.22
"Europe","Finland","OECD",2020,2018,565,561.64
"Europe","Finland","OECD",2020,2019,646,161.62
"Europe","Finland","OECD",2020,2020,194,133.69
"Europe","Finland","OECD",2021,2018,529,39.76
"Europe","Finland","OECD",2021,2019,800,680.12
"Europe","Finland","OECD",2021,2020,418,399.19
"Europe","Finland","OECD",2022,2018,591,253.12
"Europe","Finland","OECD",2022,2019,457,272.58
"Europe","Finland","OECD",2022,2020,157,105.1
"Europe","Finland","IMF",2020,2018,860,445.03
"Europe","Finland","IMF",2020,2019,108,47.72
"Europe","Finland","IMF",2020,2020,523,500.58
"Europe","Finland","IMF",2021,2018,560,81.47
"Europe","Finland","IMF",2021,2019,830,664.64
"Europe","Finland","IMF",2021,2020,903,762.62
"Europe","Finland","IMF",2022,2018,179,167.73
"Europe","Finland","IMF",2022,2019,137,98.98
"Europe","Finland","IMF",2022,2020,666,524.86
"Europe","Finland","WorldBank",2020,2018,319,146.01
"Europe","Finland","WorldBank",2020,2019,401,219.56
"Europe","Finland","WorldBank",2020,2020,711,45.35
"Europe","Finland","WorldBank",2021,2018,828,20.97
"Europe","Finland","WorldBank",2021,2019,180,66.3
"Europe","Finland","WorldBank",2021,2020,682,92.57
"Europe","Finland","WorldBank",2022,2018,254,81.2
"Europe","Finland","WorldBank",2022,2019,619,159.08
"Europe","Finland","WorldBank",2022,2020,191,184.4
"""
df = pd.read_csv(StringIO(DATA))
model = sm.OLS.from_formula("gdp ~ population + C(year_publication) + C(country)", df)
result = model.fit(
    cov_type='cluster',
    cov_kwds={'groups': np.array(df[['country', 'year_publication']])},
    use_t=True
)
print(result.summary())
I have realized that the groups must be an array of integers rather than of objects/strings: statsmodels' combine_indices() tries to view the two-column groups array as a structured array, and numpy refuses to do that for object-dtype arrays. Mixed string/integer columns (even after .astype("str")) come out of np.array(...) with dtype object, which is why the manual cast did not help.
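A short sketch (with a hypothetical toy frame) reproduces the failing view outside statsmodels:

import numpy as np
import pandas as pd

toy = pd.DataFrame({"country": ["Angola", "Benin"], "year_publication": [2020, 2021]})
g = np.array(toy[["country", "year_publication"]])
print(g.dtype)  # object: the string column forces object dtype
# the same call statsmodels makes in grouputils.combine_indices():
g.view([('', g.dtype)] * g.shape[1])  # TypeError: Cannot change data-type for object array.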
Thus, label encoding the string column as follows:
df["country"] = df["country"].astype("category")
df["country_id"] = df.country.cat.codes
and using country_id to cluster the standard errors solves the issue:
result = model.fit(
    cov_type='cluster',
    cov_kwds={'groups': np.array(df[['country_id', 'year_publication']])},
    use_t=True
)
Fully working example:
from io import StringIO
import numpy as np
import pandas as pd
import statsmodels.api as sm
DATA = """
"continent","country","source","year_publication","year_data","population","gdp"
"Africa","Angola","OECD",2020,2018,972,52.69
"Africa","Angola","OECD",2020,2019,986,802.7
"Africa","Angola","OECD",2020,2020,641,568.74
"Africa","Angola","OECD",2021,2018,438,168.83
"Africa","Angola","OECD",2021,2019,958,310.57
"Africa","Angola","OECD",2021,2020,270,144.02
"Africa","Angola","OECD",2022,2018,528,359.71
"Africa","Angola","OECD",2022,2019,974,582.98
"Africa","Angola","OECD",2022,2020,835,820.49
"Africa","Angola","IMF",2020,2018,168,148.85
"Africa","Angola","IMF",2020,2019,460,236.21
"Africa","Angola","IMF",2020,2020,360,297.15
"Africa","Angola","IMF",2021,2018,381,249.13
"Africa","Angola","IMF",2021,2019,648,128.05
"Africa","Angola","IMF",2021,2020,206,179.05
"Africa","Angola","IMF",2022,2018,282,150.29
"Africa","Angola","IMF",2022,2019,125,23.42
"Africa","Angola","IMF",2022,2020,410,247.35
"Africa","Angola","WorldBank",2020,2018,553,182.06
"Africa","Angola","WorldBank",2020,2019,847,698.87
"Africa","Angola","WorldBank",2020,2020,844,126.61
"Africa","Angola","WorldBank",2021,2018,307,239.76
"Africa","Angola","WorldBank",2021,2019,659,510.73
"Africa","Angola","WorldBank",2021,2020,548,331.89
"Africa","Angola","WorldBank",2022,2018,448,122.76
"Africa","Angola","WorldBank",2022,2019,768,761.41
"Africa","Angola","WorldBank",2022,2020,324,163.57
"Africa","Benin","OECD",2020,2018,513,196.9
"Africa","Benin","OECD",2020,2019,590,83.7
"Africa","Benin","OECD",2020,2020,791,511.09
"Africa","Benin","OECD",2021,2018,799,474.43
"Africa","Benin","OECD",2021,2019,455,234.21
"Africa","Benin","OECD",2021,2020,549,238.83
"Africa","Benin","OECD",2022,2018,235,229.33
"Africa","Benin","OECD",2022,2019,347,46.51
"Africa","Benin","OECD",2022,2020,532,392.13
"Africa","Benin","IMF",2020,2018,138,137.05
"Africa","Benin","IMF",2020,2019,978,239.82
"Africa","Benin","IMF",2020,2020,821,33.41
"Africa","Benin","IMF",2021,2018,453,291.93
"Africa","Benin","IMF",2021,2019,526,381.88
"Africa","Benin","IMF",2021,2020,467,313.57
"Africa","Benin","IMF",2022,2018,948,555.23
"Africa","Benin","IMF",2022,2019,323,289.91
"Africa","Benin","IMF",2022,2020,421,62.35
"Africa","Benin","WorldBank",2020,2018,983,271.69
"Africa","Benin","WorldBank",2020,2019,138,23.55
"Africa","Benin","WorldBank",2020,2020,636,623.65
"Africa","Benin","WorldBank",2021,2018,653,534.99
"Africa","Benin","WorldBank",2021,2019,564,368.8
"Africa","Benin","WorldBank",2021,2020,741,312.02
"Africa","Benin","WorldBank",2022,2018,328,292.11
"Africa","Benin","WorldBank",2022,2019,653,429.21
"Africa","Benin","WorldBank",2022,2020,951,242.73
"Africa","Chad","OECD",2020,2018,176,95.06
"Africa","Chad","OECD",2020,2019,783,425.34
"Africa","Chad","OECD",2020,2020,885,461.6
"Africa","Chad","OECD",2021,2018,673,15.87
"Africa","Chad","OECD",2021,2019,131,74.46
"Africa","Chad","OECD",2021,2020,430,61.58
"Africa","Chad","OECD",2022,2018,593,211.34
"Africa","Chad","OECD",2022,2019,647,550.37
"Africa","Chad","OECD",2022,2020,154,105.65
"Africa","Chad","IMF",2020,2018,160,32.41
"Africa","Chad","IMF",2020,2019,654,27.84
"Africa","Chad","IMF",2020,2020,616,468.92
"Africa","Chad","IMF",2021,2018,996,22.4
"Africa","Chad","IMF",2021,2019,126,93.18
"Africa","Chad","IMF",2021,2020,879,547.87
"Africa","Chad","IMF",2022,2018,663,520
"Africa","Chad","IMF",2022,2019,681,544.76
"Africa","Chad","IMF",2022,2020,101,55.6
"Africa","Chad","WorldBank",2020,2018,786,757.22
"Africa","Chad","WorldBank",2020,2019,599,593.69
"Africa","Chad","WorldBank",2020,2020,641,529.84
"Africa","Chad","WorldBank",2021,2018,343,287.89
"Africa","Chad","WorldBank",2021,2019,438,340.83
"Africa","Chad","WorldBank",2021,2020,762,594.67
"Africa","Chad","WorldBank",2022,2018,430,128.69
"Africa","Chad","WorldBank",2022,2019,260,242.59
"Africa","Chad","WorldBank",2022,2020,607,216.1
"Europe","Denmark","OECD",2020,2018,114,86.75
"Europe","Denmark","OECD",2020,2019,937,373.29
"Europe","Denmark","OECD",2020,2020,866,392.93
"Europe","Denmark","OECD",2021,2018,296,41.04
"Europe","Denmark","OECD",2021,2019,402,32.67
"Europe","Denmark","OECD",2021,2020,306,7.88
"Europe","Denmark","OECD",2022,2018,540,379.51
"Europe","Denmark","OECD",2022,2019,108,26.72
"Europe","Denmark","OECD",2022,2020,752,307.2
"Europe","Denmark","IMF",2020,2018,157,24.24
"Europe","Denmark","IMF",2020,2019,303,79.04
"Europe","Denmark","IMF",2020,2020,286,122.36
"Europe","Denmark","IMF",2021,2018,569,69.32
"Europe","Denmark","IMF",2021,2019,808,642.67
"Europe","Denmark","IMF",2021,2020,157,5.58
"Europe","Denmark","IMF",2022,2018,147,112.21
"Europe","Denmark","IMF",2022,2019,414,311.16
"Europe","Denmark","IMF",2022,2020,774,230.46
"Europe","Denmark","WorldBank",2020,2018,695,350.03
"Europe","Denmark","WorldBank",2020,2019,511,209.84
"Europe","Denmark","WorldBank",2020,2020,181,29.27
"Europe","Denmark","WorldBank",2021,2018,503,176.89
"Europe","Denmark","WorldBank",2021,2019,710,609.02
"Europe","Denmark","WorldBank",2021,2020,264,165.78
"Europe","Denmark","WorldBank",2022,2018,670,638.99
"Europe","Denmark","WorldBank",2022,2019,651,354.6
"Europe","Denmark","WorldBank",2022,2020,632,623.94
"Europe","Estonia","OECD",2020,2018,838,263.67
"Europe","Estonia","OECD",2020,2019,638,533.95
"Europe","Estonia","OECD",2020,2020,898,638.73
"Europe","Estonia","OECD",2021,2018,262,98.16
"Europe","Estonia","OECD",2021,2019,569,552.54
"Europe","Estonia","OECD",2021,2020,868,252.48
"Europe","Estonia","OECD",2022,2018,927,264.65
"Europe","Estonia","OECD",2022,2019,205,150.6
"Europe","Estonia","OECD",2022,2020,828,752.61
"Europe","Estonia","IMF",2020,2018,841,176.31
"Europe","Estonia","IMF",2020,2019,614,230.55
"Europe","Estonia","IMF",2020,2020,500,41.19
"Europe","Estonia","IMF",2021,2018,510,169.68
"Europe","Estonia","IMF",2021,2019,765,401.85
"Europe","Estonia","IMF",2021,2020,751,319.6
"Europe","Estonia","IMF",2022,2018,314,58.81
"Europe","Estonia","IMF",2022,2019,155,2.24
"Europe","Estonia","IMF",2022,2020,734,187.6
"Europe","Estonia","WorldBank",2020,2018,332,160.17
"Europe","Estonia","WorldBank",2020,2019,466,385.33
"Europe","Estonia","WorldBank",2020,2020,487,435.06
"Europe","Estonia","WorldBank",2021,2018,461,249.19
"Europe","Estonia","WorldBank",2021,2019,932,763.38
"Europe","Estonia","WorldBank",2021,2020,650,463.91
"Europe","Estonia","WorldBank",2022,2018,570,549.97
"Europe","Estonia","WorldBank",2022,2019,909,80.48
"Europe","Estonia","WorldBank",2022,2020,523,242.22
"Europe","Finland","OECD",2020,2018,565,561.64
"Europe","Finland","OECD",2020,2019,646,161.62
"Europe","Finland","OECD",2020,2020,194,133.69
"Europe","Finland","OECD",2021,2018,529,39.76
"Europe","Finland","OECD",2021,2019,800,680.12
"Europe","Finland","OECD",2021,2020,418,399.19
"Europe","Finland","OECD",2022,2018,591,253.12
"Europe","Finland","OECD",2022,2019,457,272.58
"Europe","Finland","OECD",2022,2020,157,105.1
"Europe","Finland","IMF",2020,2018,860,445.03
"Europe","Finland","IMF",2020,2019,108,47.72
"Europe","Finland","IMF",2020,2020,523,500.58
"Europe","Finland","IMF",2021,2018,560,81.47
"Europe","Finland","IMF",2021,2019,830,664.64
"Europe","Finland","IMF",2021,2020,903,762.62
"Europe","Finland","IMF",2022,2018,179,167.73
"Europe","Finland","IMF",2022,2019,137,98.98
"Europe","Finland","IMF",2022,2020,666,524.86
"Europe","Finland","WorldBank",2020,2018,319,146.01
"Europe","Finland","WorldBank",2020,2019,401,219.56
"Europe","Finland","WorldBank",2020,2020,711,45.35
"Europe","Finland","WorldBank",2021,2018,828,20.97
"Europe","Finland","WorldBank",2021,2019,180,66.3
"Europe","Finland","WorldBank",2021,2020,682,92.57
"Europe","Finland","WorldBank",2022,2018,254,81.2
"Europe","Finland","WorldBank",2022,2019,619,159.08
"Europe","Finland","WorldBank",2022,2020,191,184.4
"""
df = pd.read_csv(StringIO(DATA))
df["country"] = df["country"].astype("category")
df["country_id"] = df.country.cat.codes
model = sm.OLS.from_formula("gdp ~ population + C(year_publication) + C(country)", df)
result = model.fit(
    cov_type='cluster',
    cov_kwds={'groups': np.array(df[['country_id', 'year_publication']])},
    use_t=True
)
print(result.summary())
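As a side note, the same integer encoding can be done in one step with pandas.factorize; a sketch, assuming the same df:

# build an all-integer groups array: factorize the string column and keep the
# already-integer year column as-is
groups = np.column_stack([
    pd.factorize(df['country'])[0],     # integer codes 0..n_countries-1
    df['year_publication'].to_numpy(),
])
result = model.fit(cov_type='cluster', cov_kwds={'groups': groups}, use_t=True)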

How to add all standard special tokens to my hugging face tokenizer and model?

I want all special tokens to always be available. How do I do this?
My first attempt at giving them to my tokenizer:
from transformers import AutoTokenizer, PreTrainedTokenizerFast

def does_t5_have_sep_token():
    tokenizer: PreTrainedTokenizerFast = AutoTokenizer.from_pretrained('t5-small')
    assert isinstance(tokenizer, PreTrainedTokenizerFast)
    print(tokenizer)
    print(f'{len(tokenizer)=}')
    # print(f'{tokenizer.all_special_tokens=}')
    print(f'{tokenizer.sep_token=}')
    print(f'{tokenizer.eos_token=}')
    print(f'{tokenizer.all_special_tokens=}')
    # register the extra tokens as additional special tokens
    special_tokens_dict = {'additional_special_tokens': ['<bos>', '<cls>', '<s>'] + tokenizer.all_special_tokens}
    num_added_toks = tokenizer.add_special_tokens(special_tokens_dict)
    print(f'{tokenizer.sep_token=}')
    print(f'{tokenizer.eos_token=}')
    print(f'{tokenizer.all_special_tokens=}')

if __name__ == '__main__':
    does_t5_have_sep_token()
    print('Done\a')
but it feels hacky.
refs:
https://github.com/huggingface/tokenizers/issues/247
https://discuss.huggingface.co/t/how-to-add-all-standard-special-tokens-to-my-tokenizer-and-model/21529
seems useful: https://huggingface.co/docs/transformers/v4.21.1/en/main_classes/model#transformers.PreTrainedModel.resize_token_embeddings
I want to add the standard tokens by adding the right "standard tokens". The solution provided didn't work for me, since .bos_token is still None. See:
tokenizer.bos_token=None
tokenizer.cls_token=None
tokenizer.sep_token=None
tokenizer.mask_token=None
tokenizer.eos_token='</s>'
tokenizer.unk_token='<unk>'
tokenizer.bos_token_id=None
tokenizer.cls_token_id=None
tokenizer.sep_token_id=None
tokenizer.mask_token_id=None
tokenizer.eos_token_id=1
tokenizer.unk_token_id=2
tokenizer.all_special_tokens=['</s>', '<unk>', '<pad>', '<extra_id_0>', '<extra_id_1>', '<extra_id_2>', '<extra_id_3>', '<extra_id_4>', '<extra_id_5>', '<extra_id_6>', '<extra_id_7>', '<extra_id_8>', '<extra_id_9>', '<extra_id_10>', '<extra_id_11>', '<extra_id_12>', '<extra_id_13>', '<extra_id_14>', '<extra_id_15>', '<extra_id_16>', '<extra_id_17>', '<extra_id_18>', '<extra_id_19>', '<extra_id_20>', '<extra_id_21>', '<extra_id_22>', '<extra_id_23>', '<extra_id_24>', '<extra_id_25>', '<extra_id_26>', '<extra_id_27>', '<extra_id_28>', '<extra_id_29>', '<extra_id_30>', '<extra_id_31>', '<extra_id_32>', '<extra_id_33>', '<extra_id_34>', '<extra_id_35>', '<extra_id_36>', '<extra_id_37>', '<extra_id_38>', '<extra_id_39>', '<extra_id_40>', '<extra_id_41>', '<extra_id_42>', '<extra_id_43>', '<extra_id_44>', '<extra_id_45>', '<extra_id_46>', '<extra_id_47>', '<extra_id_48>', '<extra_id_49>', '<extra_id_50>', '<extra_id_51>', '<extra_id_52>', '<extra_id_53>', '<extra_id_54>', '<extra_id_55>', '<extra_id_56>', '<extra_id_57>', '<extra_id_58>', '<extra_id_59>', '<extra_id_60>', '<extra_id_61>', '<extra_id_62>', '<extra_id_63>', '<extra_id_64>', '<extra_id_65>', '<extra_id_66>', '<extra_id_67>', '<extra_id_68>', '<extra_id_69>', '<extra_id_70>', '<extra_id_71>', '<extra_id_72>', '<extra_id_73>', '<extra_id_74>', '<extra_id_75>', '<extra_id_76>', '<extra_id_77>', '<extra_id_78>', '<extra_id_79>', '<extra_id_80>', '<extra_id_81>', '<extra_id_82>', '<extra_id_83>', '<extra_id_84>', '<extra_id_85>', '<extra_id_86>', '<extra_id_87>', '<extra_id_88>', '<extra_id_89>', '<extra_id_90>', '<extra_id_91>', '<extra_id_92>', '<extra_id_93>', '<extra_id_94>', '<extra_id_95>', '<extra_id_96>', '<extra_id_97>', '<extra_id_98>', '<extra_id_99>']
Using bos_token, but it is not set yet.
Using cls_token, but it is not set yet.
Using sep_token, but it is not set yet.
Using mask_token, but it is not set yet.
code:
def does_t5_have_sep_token():
    """
    https://huggingface.co/docs/transformers/v4.21.1/en/main_classes/model#transformers.PreTrainedModel.resize_token_embeddings
    """
    import torch
    from transformers import AutoModelForSeq2SeqLM, AutoTokenizer, PreTrainedTokenizerFast

    tokenizer: PreTrainedTokenizerFast = AutoTokenizer.from_pretrained('t5-small')
    assert isinstance(tokenizer, PreTrainedTokenizerFast)
    print(tokenizer)
    print(f'{len(tokenizer)=}')
    print()
    print(f'{tokenizer.sep_token=}')
    print(f'{tokenizer.eos_token=}')
    print(f'{tokenizer.all_special_tokens=}')
    print()

    # special_tokens_dict = {'additional_special_tokens': ['<bos>', '<cls>', '<s>'] + tokenizer.all_special_tokens}
    # num_added_toks = tokenizer.add_special_tokens(special_tokens_dict)
    tokenizer.add_tokens([f"_{n}" for n in range(1, 100)], special_tokens=True)

    model = AutoModelForSeq2SeqLM.from_pretrained("t5-small")
    assert isinstance(model, torch.nn.Module)
    model.resize_token_embeddings(len(tokenizer))
    # tokenizer.save_pretrained('pathToExtendedTokenizer/')
    # tokenizer = T5Tokenizer.from_pretrained("sandbox/t5_models/pretrained/tokenizer/")

    print()
    print(f'{tokenizer.bos_token=}')
    print(f'{tokenizer.cls_token=}')
    print(f'{tokenizer.sep_token=}')
    print(f'{tokenizer.mask_token=}')
    print(f'{tokenizer.eos_token=}')
    print(f'{tokenizer.unk_token=}')
    print(f'{tokenizer.bos_token_id=}')
    print(f'{tokenizer.cls_token_id=}')
    print(f'{tokenizer.sep_token_id=}')
    print(f'{tokenizer.mask_token_id=}')
    print(f'{tokenizer.eos_token_id=}')
    print(f'{tokenizer.unk_token_id=}')
    print(f'{tokenizer.all_special_tokens=}')
    print()

if __name__ == '__main__':
    does_t5_have_sep_token()
    print('Done\a')
I do not entirely understand what you're trying to accomplish, but here are some notes that might help:
T5 documentation shows that T5 has only three special tokens (</s>, <unk> and <pad>). You can also see this in the T5Tokenizer class definition. I am confident this is because the original T5 model was trained only with these special tokens (no BOS, no MASK, no CLS).
Running, e.g.,
from transformers import AutoTokenizer
tokenizer = AutoTokenizer.from_pretrained('t5-small')
print(tokenizer.all_special_tokens)
will show you these three tokens as well as the <extra_id_*> tokens.
Is there a reason you want the other tokens like BOS?
(Edit - to answer your comments):
(I really think you would benefit from reading the linked documentation at huggingface. The point of a pretrained model is to take advantage of what has already been done. T5 does not use BOS nor CLS in the way you seem to be imagining. Maybe you can get it to work, but IMO it makes more sense to adapt the task you want to solve to the T5 approach)
</s> is the sep token and is already available.
As I understand, for the T5 model, masking (for the sake of ignoring loss) is implemented using attention_mask. On the other hand, if you want to "fill in the blank" then <extra_id> is used to indicate to the model that it should predict the missing token (this is how semi-supervised pretraining is done). See the section on training in the documentation.
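For example, a short sketch of that fill-in-the-blank use (the example sentence is the one from the T5 docs):

from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('t5-small')
model = AutoModelForSeq2SeqLM.from_pretrained('t5-small')
# sentinel tokens mark the blanks; the model generates the spans they replace
inputs = tokenizer('The <extra_id_0> walks in <extra_id_1> park', return_tensors='pt')
outputs = model.generate(inputs.input_ids)
print(tokenizer.decode(outputs[0]))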
BOS is similar: T5 is not trained to use a BOS token. E.g. (again from the documentation):
Note that T5 uses the pad_token_id as the decoder_start_token_id, so
when doing generation without using generate(), make sure you start it
with the pad_token_id.
T5 does not use the CLS token. If you want to do classification, you should fine-tune on a new task (or find a corresponding one done in pretraining), training the model to generate a word (or words) that corresponds to the classifications you want.
(Again from the documentation:)
Build model inputs from a sequence or a pair of sequence for sequence
classification tasks by concatenating and adding special tokens. A
sequence has the following format:
- single sequence: X </s>
- pair of sequences: A </s> B </s>
I think this is correct. Please correct me if I'm wrong:
from typing import Union
from transformers import PreTrainedTokenizer, PreTrainedTokenizerFast

def add_special_all_special_tokens(tokenizer: Union[PreTrainedTokenizer, PreTrainedTokenizerFast]):
    """
    special_tokens_dict = {"cls_token": "<CLS>"}
    num_added_toks = tokenizer.add_special_tokens(special_tokens_dict)
    print("We have added", num_added_toks, "tokens")
    # Notice: resize_token_embeddings expects to receive the full size of the new vocabulary, i.e., the length of the tokenizer.
    model.resize_token_embeddings(len(tokenizer))
    assert tokenizer.cls_token == "<CLS>"
    """
    original_len: int = len(tokenizer)
    num_added_toks: dict = {}
    # only register a placeholder for each special token that is missing
    if tokenizer.bos_token is None:
        num_added_toks['bos_token'] = "<bos>"
    if tokenizer.cls_token is None:
        num_added_toks['cls_token'] = "<cls>"
    if tokenizer.sep_token is None:
        num_added_toks['sep_token'] = "<s>"
    if tokenizer.mask_token is None:
        num_added_toks['mask_token'] = "<mask>"
    # num_added_toks = {"bos_token": "<bos>", "cls_token": "<cls>", "sep_token": "<s>", "mask_token": "<mask>"}
    # special_tokens_dict = {'additional_special_tokens': new_special_tokens + tokenizer.all_special_tokens}
    num_new_tokens: int = tokenizer.add_special_tokens(num_added_toks)
    assert tokenizer.bos_token == "<bos>"
    assert tokenizer.cls_token == "<cls>"
    assert tokenizer.sep_token == "<s>"
    assert tokenizer.mask_token == "<mask>"
    msg = f"Error, not equal: {len(tokenizer)=}, {original_len + num_new_tokens=}"
    assert len(tokenizer) == original_len + num_new_tokens, msg
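A usage sketch (assuming the t5-small model from the question; remember to resize the embeddings after adding tokens):

from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained('t5-small')
model = AutoModelForSeq2SeqLM.from_pretrained('t5-small')
add_special_all_special_tokens(tokenizer)      # adds <bos>, <cls>, <s>, <mask> if missing
model.resize_token_embeddings(len(tokenizer))  # keep the embedding matrix in sync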
Here is the comment from the docs that inspired my answer:
def add_special_tokens(self, special_tokens_dict: Dict[str, Union[str, AddedToken]]) -> int:
    """
    Add a dictionary of special tokens (eos, pad, cls, etc.) to the encoder and link them to class attributes. If
    special tokens are NOT in the vocabulary, they are added to it (indexed starting from the last index of the
    current vocabulary).

    Note: When adding new tokens to the vocabulary, you should make sure to also resize the token embedding
    matrix of the model so that its embedding matrix matches the tokenizer.
    In order to do that, please use the [`~PreTrainedModel.resize_token_embeddings`] method.

    Using `add_special_tokens` will ensure your special tokens can be used in several ways:

    - Special tokens are carefully handled by the tokenizer (they are never split).
    - You can easily refer to special tokens using tokenizer class attributes like `tokenizer.cls_token`. This
      makes it easy to develop model-agnostic training and fine-tuning scripts.

    When possible, special tokens are already registered for provided pretrained models (for instance
    [`BertTokenizer`] `cls_token` is already registered to be `'[CLS]'` and XLM's one is also registered to be
    `'</s>'`).

    Args:
        special_tokens_dict (dictionary *str* to *str* or `tokenizers.AddedToken`):
            Keys should be in the list of predefined special attributes: [`bos_token`, `eos_token`, `unk_token`,
            `sep_token`, `pad_token`, `cls_token`, `mask_token`, `additional_special_tokens`].
            Tokens are only added if they are not already in the vocabulary (tested by checking if the tokenizer
            assign the index of the `unk_token` to them).

    Returns:
        `int`: Number of tokens added to the vocabulary.

    Examples:

    ```python
    # Let's see how to add a new classification token to GPT-2
    tokenizer = GPT2Tokenizer.from_pretrained("gpt2")
    model = GPT2Model.from_pretrained("gpt2")

    special_tokens_dict = {"cls_token": "<CLS>"}

    num_added_toks = tokenizer.add_special_tokens(special_tokens_dict)
    print("We have added", num_added_toks, "tokens")
    # Notice: resize_token_embeddings expect to receive the full size of the new vocabulary, i.e., the length of the tokenizer.
    model.resize_token_embeddings(len(tokenizer))

    assert tokenizer.cls_token == "<CLS>"
    ```"""
It was in HF's tokenization_utils_base.py.
I think the right answer is here: https://stackoverflow.com/a/73361984/1601580
Links can be bad answers, so to be explicit: it is the add_special_all_special_tokens function shown above.
Feedback is always welcomed.

Applying a non-aggregating function to a groupby pandas object

I have a dataframe (called cep) with two indexes (Cloud and Mode) and data columns; a small sample is given at the end of this question.
The data columns are fitted to a linear function, and I'm extracting the residuals from the fit in this way:
import pandas as pd
from scipy.optimize import least_squares

args = [-1, 15]  # initial guess for the fit

def residuals(args, x, y):
    """
    Residual with respect to a linear function
    args : list with 2 arguments
    x : array
    y : array
    """
    return args[0] * x + args[1] - y

def residual_function(df):
    """
    Returns the array of the residuals
    """
    return least_squares(residuals, args, loss='soft_l1', f_scale=0.5, args=(df.logP1, df.W)).fun

cep.groupby(['Cloud', 'Mode']).apply(lambda grp: residual_function(grp))
This gives the expected result: one array of residuals per (Cloud, Mode) group.
Now to my issue: I'd like to insert those residual values, each into its respective row of the original dataframe, to compare them with other columns.
I checked that the returned arrays have the right length to be inserted, but so far I have no idea how to proceed.
I tried to follow tutorials, but the difference from the textbook problem here is that the function I apply does not aggregate the data. Do you have some hints?
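One possible direction (a sketch, untested on the full data): wrap the residual array in a Series indexed like the group, so the groupby result aligns row-by-row with the original dataframe:

def residual_series(grp):
    # index the residuals by the group's own row labels so the concatenated
    # result aligns with cep's index
    return pd.Series(residual_function(grp), index=grp.index)

cep['resid'] = cep.groupby(['Cloud', 'Mode'], group_keys=False).apply(residual_series)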
Small sample data here:
Mode;Cloud;W;logP1
F;LMC;14,525;0,4939
F;LMC;13,4954;0,7491
F;LMC;14,5421;0,4249
F;LMC;12,033;1,0215
F;LMC;14,3422;0,5655
F;LMC;13,937;0,6072
F;LMC;13,53;0,737
F;LMC;15,2106;0,2309
F;LMC;14,0813;0,5721
F;LMC;14,5128;0,41
F;LMC;14,1059;0,5469
F;LMC;15,6032;0,1014
F;LMC;13,1088;0,8562
F;LMC;12,3528;1,0513
F;LMC;13,1629;0,8416
F;LMC;14,3114;0,4867
F;LMC;14,4013;0,498
F;LMC;13,5057;0,7131
F;LMC;14,3626;0,464
F;LMC;14,5973;0,4111
F;LMC;13,9286;0,6059
F;LMC;15,066;0,2711
F;LMC;12,7364;0,9466
F;LMC;13,3753;0,7442
F;LMC;13,9748;0,5854
F;LMC;12,8836;0,8946
F;LMC;14,4912;0,4206
F;LMC;14,4131;0,4567
F;LMC;12,183;1,1382
F;LMC;14,5492;0,3686
F;LMC;14,1482;0,5339
F;LMC;13,7062;0,7116
F;LMC;13,0731;0,8682
F;LMC;11,5609;1,353
F;LMC;13,9453;0,5551
F;LMC;14,0072;0,6715
F;LMC;13,9838;0,6021
F;LMC;13,9974;0,5562
F;LMC;14,3898;0,5069
F;LMC;14,4497;0,4433
F;LMC;14,3524;0,5064
F;LMC;12,9604;0,9134
F;LMC;12,9757;0,8548
F;LMC;14,2783;0,4927
F;LMC;13,7148;0,6758
F;LMC;14,2348;0,5142
F;LMC;12,6793;0,9415
F;LMC;14,2241;0,5738
F;LMC;14,472;0,4554
F;LMC;15,1508;0,2076
F;LMC;12,5414;1,0159
F;LMC;14,2102;0,5334
F;LMC;15,6086;0,1116
F;LMC;13,2986;0,8381
F;LMC;13,0136;0,8864
F;LMC;13,9774;0,585
F;LMC;14,4256;0,533
F;LMC;14,3582;0,4578
F;LMC;14,3258;0,4859
F;LMC;14,6646;0,3757
F;LMC;12,733;0,9901
F;LMC;14,6296;0,3839
F;LMC;14,054;0,5766
F;LMC;14,3194;0,4884
F;LMC;12,6602;0,9715
F;LMC;13,5909;0,5675
F;LMC;13,9268;0,6196
F;LMC;12,5813;0,9935
F;LMC;13,0824;0,8591
F;LMC;13,5097;0,7375
F;LMC;13,1938;0,5053
F;LMC;14,7357;0,3253
F;LMC;14,0624;0,6009
F;LMC;14,1528;0,533
F;LMC;14,6709;0,4007
F;LMC;14,2378;0,4875
F;LMC;11,951;1,2004
F;LMC;14,4555;0,4777
F;LMC;14,4001;0,4404
F;LMC;13,7707;0,6311
F;LMC;14,578;0,4175
F;LMC;15,8662;0,0159
F;LMC;14,055;0,5687
F;LMC;13,6238;0,7307
F;LMC;15,2572;0,2171
F;LMC;13,4022;0,7723
F;LMC;14,2392;0,5256
F;LMC;14,2505;0,4977
F;LMC;14,7174;0,3614
F;LMC;14,487;0,418
F;LMC;14,9309;0,3086
F;LMC;13,8352;0,6334
F;LMC;14,5598;0,41
F;LMC;14,5614;0,422
F;LMC;14,1486;0,5149
F;LMC;14,0304;0,4945
F;LMC;13,5781;0,6801
F;LMC;14,79;0,3218
F;LMC;12,376;1,0908
F;LMC;15,3215;0,2176
F;LMC;14,7264;0,3845
F;LMC;14,6276;0,4057
F;LMC;14,1712;0,5313
F;LMC;14,4153;0,483
F;LMC;12,905;0,9356
F;LMC;14,442;0,4309
F;LMC;12,8702;0,9159
F;LMC;12,8963;0,5775
F;LMC;13,8304;0,6467
F;LMC;14,4665;0,4165
F;LMC;13,0756;0,5794
F;LMC;13,841;0,6593
F;LMC;14,0924;0,5671
F;LMC;13,7546;0,6778
F;LMC;14,2828;0,5181
F;LMC;14,2424;0,5082
F;LMC;14,659;0,3989
F;LMC;13,7528;0,6768
F;LMC;13,7743;0,6368
F;LMC;13,2894;0,791
F;LMC;14,7512;0,3187
F;LMC;14,5241;0,4452
F;LMC;14,301;0,5121
F;LMC;13,334;0,7945
F;LMC;13,5052;0,7012
F;LMC;14,3664;0,4549
F;LMC;14,8614;0,3278
F;LMC;13,8612;0,582
F;LMC;14,2668;0,5158
F;LMC;14,3937;0,4457
F;LMC;14,0226;0,582
F;LMC;14,387;0,5565
F;LMC;14,3198;0,4362
F;LMC;14,4404;0,4701
F;LMC;14,2774;0,4939
F;LMC;13,7678;0,6557
F;LMC;14,3212;0,4882
F;LMC;14,6453;0,3696
F;LMC;13,9064;0,6084
F;LMC;13,5167;0,7581
F;LMC;14,1692;0,5134
F;LMC;14,6714;0,4136
F;LMC;14,4332;0,4507
F;LMC;14,705;0,3631
F;LMC;13,6728;0,496
F;LMC;15,358;0,1651
F;LMC;13,7592;0,6278
F;LMC;14,0626;0,5754
F;LMC;13,1127;0,8692
F;LMC;14,2108;0,498
F;LMC;14,4519;0,4449
F;LMC;14,0041;0,5666
F;LMC;14,157;0,5392
F;LMC;14,254;0,5245
F;LMC;15,4844;0,1838
F;LMC;14,0845;0,5626
F;LMC;13,0861;0,838
F;LMC;13,3144;0,831
F;LMC;14,2535;0,4911
F;LMC;14,0256;0,5723
F;LMC;14,3246;0,4938
F;LMC;14,4412;0,4136
F;LMC;14,1043;0,518
F;LMC;14,7512;0,3772
F;LMC;14,3982;0,5039
F;LMC;14,2701;0,5042
F;LMC;13,9166;0,5941
F;LMC;13,0324;0,837
F;LMC;13,4839;0,6331
F;LMC;13,4491;0,7443
F;LMC;14,4702;0,458
F;LMC;14,4814;0,4595
F;LMC;14,3008;0,4575
F;LMC;14,922;0,3313
F;LMC;14,6542;0,4263
F;LMC;14,5007;0,4838
F;LMC;14,4335;0,4829
F;LMC;14,4737;0,4586
F;LMC;14,2537;0,5442
F;LMC;14,038;0,5473
F;LMC;14,1413;0,5523
F;LMC;14,669;0,3505
F;LMC;12,3572;1,1033
F;LMC;13,868;0,6416
F;LMC;13,4292;0,816
F;LMC;11,6771;1,3442
F;LMC;14,5086;0,4654
F;LMC;14,3588;0,4807
F;LMC;14,6915;0,3674
F;LMC;15,6488;0,0647
F;LMC;12,4187;0,9791
F;LMC;14,1555;0,5235
F;LMC;14,5765;0,4281
F;LMC;14,3579;0,4596
F;LMC;13,0932;0,7957
F;LMC;14,4552;0,4216
F;LMC;13,2221;0,8505
F;LMC;14,4465;0,4466
F;LMC;14,2439;0,5032
F;LMC;14,9606;0,6308
F;LMC;14,4774;0,4424
F;LMC;14,1875;0,5361
F;LMC;13,3982;0,7644
F;LMC;13,0973;0,8595
F;LMC;13,8264;0,6334
F;LMC;13,9296;0,6164
F;LMC;14,5778;0,4033
F;LMC;13,579;0,726
F;LMC;14,0054;0,5779
F;LMC;14,1219;0,5451
F;LMC;14,3512;0,4808
F;LMC;14,5058;0,4199
F;LMC;14,598;0,4201
F;LMC;14,9516;0,2498
F;LMC;13,9944;0,6075
F;LMC;13,9462;0,557
F;LMC;14,2576;0,5148
F;LMC;14,9814;0,2929
F;LMC;14,3851;0,4573
F;LMC;14,3474;0,4606
F;LMC;14,4929;0,3882
F;LMC;14,5201;0,4234
F;LMC;13,7677;0,6548
F;LMC;14,3146;0,4695
F;LMC;14,2846;0,507
F;LMC;14,0967;0,5525
F;LMC;14,7976;0,3546
F;LMC;13,7497;0,6362
F;LMC;14,4647;0,4363
F;LMC;14,1924;0,5293
F;LMC;14,588;0,4089
F;LMC;13,4896;0,7329
F;LMC;14,695;0,3737
F;LMC;14,2672;0,4857
F;LMC;14,0784;0,5848
F;LMC;13,879;0,5743
F;LMC;14,2214;0,4988
F;LMC;12,922;0,8487
F;LMC;14,189;0,5238
F;LMC;13,9938;0,5713
F;LMC;14,379;0,4771
F;LMC;11,2308;1,3564
F;LMC;14,4472;0,4205
F;LMC;14,3739;0,4699
F;LMC;14,393;0,4416
F;LMC;13,9108;0,5927
F;LMC;14,0298;0,6058
F;LMC;15,1538;0,1961
F;LMC;13,0393;0,8731
F;LMC;13,7144;0,645
F;LMC;14,2682;0,487
F;LMC;14,3506;0,4927
F;LMC;14,0472;0,5619
F;LMC;15,1418;0,2506
F;LMC;13,1227;0,5998
F;LMC;13,5646;0,7193
F;LMC;14,5872;0,4357
F;LMC;14,2636;0,5007
F;LMC;13,9564;0,5599
F;LMC;12,8576;0,946
F;LMC;12,3042;1,1454
F;LMC;11,8416;1,3675
F;LMC;13,5498;0,7219
F;LMC;12,1976;1,1581
F;LMC;13,8632;0,6202
F;LMC;14,2952;0,4807
F;LMC;14,4349;0,4437
F;LMC;14,2392;0,5445
F;LMC;13,7248;0,7213
F;LMC;14,3395;0,5117
F;LMC;15,3588;0,2253
F;LMC;12,8509;0,9229
F;LMC;15,5192;0,1453
F;LMC;14,2072;0,4975
F;LMC;14,3524;0,4945
F;LMC;14,5152;0,4488
F;LMC;14,5106;0,4558
F;LMC;14,5759;0,3786
F;LMC;11,196;1,2374
F;LMC;14,3736;0,4788
F;LMC;14,1726;0,528
F;LMC;11,7899;1,1995
F;LMC;12,1062;1,1823
F;LMC;13,7113;0,6714
F;LMC;14,3512;0,4815
F;LMC;13,1016;0,8181
F;LMC;14,4968;0,562
F;LMC;12,4557;1,0671
F;LMC;14,0573;0,551
F;LMC;14,5916;0,4066
F;LMC;14,3214;0,488
F;LMC;13,5498;0,4885
F;LMC;14,4679;0,4273
F;LMC;14,2426;0,4816
F;LMC;13,5759;0,7052
F;LMC;14,0081;0,5769
F;LMC;14,0828;0,5379
F;LMC;12,4168;0,7578
F;LMC;14,1624;0,5052
F;LMC;13,8029;0,6621
F;LMC;14,1944;0,5145
F;LMC;13,7944;0,6184
F;LMC;15,0234;0,3158
F;LMC;13,0961;0,8282
F;LMC;13,976;0,5889
F;LMC;14,3236;0,4847
F;LMC;14,2618;0,4691
F;LMC;13,4528;0,7349
F;LMC;14,2846;0,507
F;LMC;14,4115;0,446
F;LMC;14,2199;0,5336
F;LMC;14,456;0,4423
F;LMC;14,2938;0,488
F;LMC;14,4109;0,4606
F;LMC;14,2599;0,497
F;LMC;13,9034;0,6384
F;LMC;13,6126;0,7075
F;LMC;14,5036;0,4218
F;LMC;14,0065;0,5741
F;LMC;14,8622;0,3404
F;LMC;14,635;0,3683
F;LMC;14,222;0,5454
F;LMC;14,1501;0,5548
F;LMC;14,0822;0,5705
F;LMC;13,5036;0,7267
F;LMC;14,5528;0,4161
F;LMC;14,3332;0,4614
F;LMC;14,1511;0,5471
F;LMC;14,6113;0,3934
F;LMC;14,2998;0,5031
F;LMC;14,1807;0,5352
F;LMC;13,5114;0,7013
F;LMC;12,2096;1,1344
F;LMC;14,3799;0,4304
F;LMC;12,4526;1,1135
F;LMC;14,5042;0,447
F;LMC;13,4594;0,7336
F;LMC;13,2066;0,8423
F;LMC;14,3734;0,4711
F;LMC;13,945;0,5953
F;LMC;12,9938;0,8969
F;LMC;13,4993;0,7034
F;LMC;13,9466;0,5678
F;LMC;14,1772;0,5077
F;LMC;13,5566;0,6949
F;LMC;14,021;0,5811
F;LMC;14,0264;0,646
F;LMC;12,0242;1,1666
F;LMC;14,3106;0,5027
F;LMC;14,9838;0,3164
F;LMC;14,1718;0,5266
F;LMC;14,2606;0,489
F;LMC;12,6479;1,0206
F;LMC;12,9768;0,8684
F;LMC;14,0837;0,5785
F;LMC;13,7944;0,6609
F;LMC;13,532;0,6911
F;LMC;14,835;0,3375
F;LMC;13,7378;0,6941
F;LMC;14,3618;0,4658
F;LMC;12,4782;1,0176
F;LMC;14,2216;0,4981
F;LMC;14,3958;0,4917
F;LMC;11,3796;1,3161
F;LMC;13,8073;0,6301
F;LMC;14,414;0,4601
F;LMC;12,4266;1,086
F;LMC;14,7974;0,3547
F;LMC;14,3369;0,5189
F;LMC;14,3202;0,4874
F;LMC;14,4614;0,4664
F;LMC;13,8344;0,6339
F;LMC;14,0452;0,5896
F;LMC;11,9134;1,161
F;LMC;14,2492;0,4891
F;LMC;14,1338;0,5139
F;LMC;14,439;0,4476
F;LMC;14,1446;0,5322
F;LMC;14,102;0,549
F;LMC;14,5043;0,4421
F;LMC;14,388;0,4511
F;LMC;12,3812;1,0331
F;LMC;14,5086;0,4294
F;LMC;13,6822;0,671
F;LMC;12,3012;1,0862
F;LMC;14,0848;0,534
F;LMC;14,3381;0,4886
F;LMC;14,5544;0,3908
F;LMC;14,216;0,5226
F;LMC;14,5028;0,4323
F;LMC;12,7769;0,9244
F;LMC;13,6262;0,6984
F;LMC;14,5276;0,4107
F;LMC;13,921;0,5835
F;LMC;14,6279;0,396
F;LMC;14,6304;0,3796
F;LMC;14,2079;0,4722
F;LMC;12,4538;1,0356
F;LMC;14,2662;0,4876
F;LMC;13,8493;0,6217
F;LMC;12,9806;0,8385
F;LMC;14,3148;0,4768
F;LMC;14,2225;0,49
F;LMC;14,3932;0,4084
F;LMC;13,6934;0,5829
F;LMC;14,1702;0,5297
F;LMC;11,7812;1,2435
F;LMC;14,2866;0,4778
F;LMC;15,2824;0,1739
F;LMC;14,451;0,4485
F;LMC;14,4842;0,4222
F;LMC;14,3422;0,449
F;LMC;14,4408;0,4435
F;LMC;12,527;1,0298
F;LMC;12,3746;1,1016
F;LMC;11,4802;1,3276
F;LMC;14,47;0,4643
F;LMC;14,1469;0,5183
F;SMC;14,423;0,4796
F;SMC;15,5626;0,2344
F;SMC;15,6889;0,236
F;SMC;15,3574;0,2926
F;SMC;15,8049;0,1015
F;SMC;12,9034;0,9993
F;SMC;14,0039;0,6867
F;SMC;15,9834;0,1812
F;SMC;15,7707;0,2028
F;SMC;15,777;0,1735
F;SMC;14,7121;0,4973
F;SMC;13,8691;0,7188
F;SMC;14,889;0,4123
F;SMC;14,5322;0,6233
F;SMC;15,6791;0,331
F;SMC;13,9406;0,7262
F;SMC;13,728;0,8514
F;SMC;15,1952;0,3583
F;SMC;16,0921;0,1397
F;SMC;15,6162;0,1532
F;SMC;15,786;0,2563
F;SMC;16,0774;0,1197
F;SMC;14,4397;0,599
F;SMC;15,8693;0,2072
F;SMC;15,6668;0,2452
F;SMC;15,1954;0,3509
F;SMC;14,1387;0,669
F;SMC;15,6928;0,2125
F;SMC;14,6266;0,5017
F;SMC;15,9557;0,1772
F;SMC;15,607;0,2501
F;SMC;15,9632;0,1629
F;SMC;15,7932;0,2325
F;SMC;15,7108;0,1534
F;SMC;13,037;0,9898
F;SMC;15,3998;0,2915
F;SMC;15,1724;0,3675
F;SMC;13,7222;0,7848
F;SMC;14,8296;0,5222
F;SMC;15,704;0,2407
F;SMC;13,5231;0,8378
F;SMC;14,4338;0,5303
F;SMC;14,6202;0,4843
F;SMC;16,2836;0,0473
F;SMC;15,6011;0,1758
F;SMC;16,0037;0,1571
F;SMC;13,9062;0,6286
F;SMC;16,0606;0,0557
F;SMC;13,2924;0,8905
F;SMC;15,9942;0,1997
F;SMC;15,7766;0,2395
F;SMC;10,8462;1,6309
F;SMC;15,956;0,1425
F;SMC;13,857;0,7079
F;SMC;15,3619;0,2696
F;SMC;14,0064;0,6903
F;SMC;15,6531;0,2602
F;SMC;14,9001;0,5001
F;SMC;14,3957;0,6156
F;SMC;15,4414;0,3174
F;SMC;15,8321;0,1822
F;SMC;16,3562;0,1385
F;SMC;15,8812;0,1651
F;SMC;15,1404;0,408
F;SMC;13,7978;0,8055
F;SMC;15,9291;0,132
F;SMC;15,0555;0,507
F;SMC;15,5766;0,2596
F;SMC;13,6006;0,8469
F;SMC;16,455;0,0629
F;SMC;15,8762;0,1072
F;SMC;16,2856;0,0768
F;SMC;15,8521;0,2129
F;SMC;15,7685;0,2374
F;SMC;16,1197;0,1043
F;SMC;16,0851;0,2333
F;SMC;15,8126;0,1777
F;SMC;14,3891;0,6065
F;SMC;14,6419;0,5446
F;SMC;15,3942;0,3101
F;SMC;15,5785;0,2494
F;SMC;15,661;0,2227
F;SMC;15,9648;0,1405
F;SMC;12,7911;1,0845
F;SMC;15,9351;0,1575
F;SMC;14,1764;0,6864
F;SMC;15,153;0,3624
F;SMC;15,9336;0,1232
F;SMC;15,0124;0,3796
F;SMC;16,1231;0,106
F;SMC;14,4362;0,5306
F;SMC;13,1883;0,8354
F;SMC;15,8972;0,1757
F;SMC;14,1612;0,7287
F;SMC;15,3792;0,2869
F;SMC;16,421;0,0329
F;SMC;14,833;0,4543
F;SMC;14,3997;0,5912
F;SMC;15,8797;0,1747
F;SMC;16,0337;0,1565
F;SMC;15,7371;0,2251
F;SMC;13,954;0,7293
F;SMC;14,1691;0,6695
F;SMC;15,6208;0,2211
F;SMC;14,3416;0,6492
F;SMC;14,6636;0,5423
F;SMC;16,0386;0,1506
F;SMC;14,6578;0,5604
F;SMC;15,6368;0,24
F;SMC;14,843;0,4738
F;SMC;14,9818;0,4869
F;SMC;12,4251;1,1641
F;SMC;15,0727;0,4671
F;SMC;14,1448;0,5949
F;SMC;15,2148;0,3644
F;SMC;15,9372;0,117
F;SMC;15,4336;0,3018
F;SMC;14,5416;0,557
F;SMC;16,4654;0,0436
F;SMC;14,934;0,5498
F;SMC;14,3896;0,695
F;SMC;15,3896;0,3492
F;SMC;15,8122;0,1602
F;SMC;13,7822;0,704
F;SMC;15,7938;0,1679
F;SMC;15,4049;0,3059
F;SMC;16,0742;0,1187
F;SMC;15,704;0,2036
F;SMC;14,9947;0,3748
F;SMC;15,1374;0,4001
F;SMC;13,2254;0,7136
F;SMC;14,3267;0,577
F;SMC;12,7772;1,0317
F;SMC;15,5302;0,3074
F;SMC;16,12;0,1395
F;SMC;15,9826;0,1873
F;SMC;15,9196;0,2025
F;SMC;15,5396;0,2888
F;SMC;14,0063;0,7543
F;SMC;14,6752;0,542
F;SMC;14,3782;0,6365
F;SMC;15,8015;0,2321
F;SMC;15,4898;0,0235
F;SMC;15,6376;0,2499
F;SMC;15,527;0,2697
F;SMC;15,2883;0,3324
F;SMC;15,1014;0,3996
F;SMC;14,435;0,5827
F;SMC;16,1522;0,0832
F;SMC;13,3787;0,8974
F;SMC;16,6258;0,0226
F;SMC;14,0421;0,8043
F;SMC;15,4764;0,2719
F;SMC;14,1377;0,6069
F;SMC;15,3654;0,3461
F;SMC;16,3063;0,0677
F;SMC;15,5912;0,2227
F;SMC;14,555;0,5143
F;SMC;16,2947;0,0824
F;SMC;15,2208;0,3488
F;SMC;16,8052;-0,0287
F;SMC;15,8592;0,1835
F;SMC;15,6349;0,2632
F;SMC;16,522;0,0581
F;SMC;15,7794;0,3351
F;SMC;16,095;0,1574
F;SMC;16,0564;0,1818
F;SMC;16,4614;0,0897
F;SMC;16,1351;0,1332
F;SMC;14,4711;0,5808
F;SMC;13,8768;0,6795
F;SMC;16,2458;0,1273
F;SMC;16,1994;0,0372
F;SMC;15,3434;0,3072
F;SMC;15,5384;0,2442
F;SMC;14,5322;0,5703
F;SMC;15,7762;0,3507
F;SMC;14,3793;0,5628
F;SMC;15,4777;0,3139
F;SMC;15,9216;0,1764
F;SMC;14,3758;0,5278
F;SMC;15,2363;0,3313
F;SMC;14,3224;0,3258
F;SMC;15,2266;0,3656
F;SMC;15,6305;0,174
F;SMC;14,046;0,7832
F;SMC;14,8704;0,507
F;SMC;16,0267;0,2357
F;SMC;16,0671;0,154
F;SMC;13,8434;0,6901
F;SMC;14,4167;0,5992
F;SMC;15,9808;0,125
F;SMC;16,0696;0,1131
F;SMC;15,166;0,166
F;SMC;14,1023;0,6447
F;SMC;13,9666;0,6979
F;SMC;15,64;0,2577
F;SMC;15,6974;0,2429
F;SMC;15,1257;0,3877
F;SMC;15,186;0,3295
F;SMC;14,87;0,4651
F;SMC;16,0943;0,1807
F;SMC;15,7421;0,1809
F;SMC;14,6085;0,5253
F;SMC;14,6912;0,4777
F;SMC;14,1322;0,71
F;SMC;15,3319;0,2937
F;SMC;14,9283;0,4639
F;SMC;15,3753;0,2732
F;SMC;15,0886;0,3989
F;SMC;15,3778;0,3028
F;SMC;16,4933;0,0274
F;SMC;14,7944;0,4336
F;SMC;13,7806;0,7397
F;SMC;14,1895;0,6325
F;SMC;15,947;0,1084
F;SMC;15,9606;0,1665
F;SMC;15,417;0,0976
F;SMC;15,2905;0,3652
F;SMC;14,7712;0,4453
F;SMC;14,6692;0,5412
F;SMC;16,1936;0,0286
F;SMC;15,6136;0,2097
F;SMC;15,8061;0,078
F;SMC;15,3243;0,3385
F;SMC;15,2366;0,3669
F;SMC;16,1653;0,0573
F;SMC;15,916;0,1591
F;SMC;15,2422;0,3216
F;SMC;12,2583;1,2107
F;SMC;15,6361;0,1766
F;SMC;16,0818;0,1771
F;SMC;15,6966;0,2147
F;SMC;16,193;0,0657
F;SMC;14,8256;0,4574
F;SMC;15,7214;0,2185
F;SMC;15,5803;0,2725
F;SMC;14,7322;0,4754
F;SMC;15,8964;0,1898
F;SMC;14,5428;0,4732
F;SMC;16,1362;0,1396
F;SMC;16,2832;0,0473
F;SMC;15,6508;0,2232
F;SMC;14,725;0,4998
F;SMC;16,1585;0,1106
F;SMC;15,2284;0,3727
F;SMC;15,1728;0,3718
F;SMC;14,5354;0,5431
F;SMC;15,8224;0,1256
F;SMC;15,5462;0,2633
F;SMC;14,942;0,455
F;SMC;16,02;0,1426
F;SMC;15,2292;0,2965
F;SMC;14,6639;0,4402
F;SMC;14,887;0,4365
F;SMC;15,8288;0,1924
F;SMC;14,4903;0,5274
F;SMC;15,9464;0,1638
F;SMC;15,8069;0,1999
F;SMC;14,9924;0,3985
F;SMC;15,6917;0,1355
F;SMC;15,5414;0,1628
F;SMC;15,6168;0,2157
F;SMC;15,8006;0,177
F;SMC;14,9294;0,4732
F;SMC;14,5272;0,599
F;SMC;15,7318;0,2691
F;SMC;14,5181;0,5782
F;SMC;15,8524;0,2074
F;SMC;13,773;0,747
F;SMC;15,7608;0,1586
F;SMC;13,947;0,688
F;SMC;14,9774;0,4224
F;SMC;14,5288;0,4912
F;SMC;12,4944;1,2355
F;SMC;13,8683;0,6944
F;SMC;15,7118;0,186
F;SMC;15,7392;0,2081
F;SMC;12,292;1,1395
F;SMC;14,7918;0,4632
F;SMC;15,4428;0,3367
F;SMC;14,7542;0,4279
F;SMC;15,2914;0,3575
F;SMC;14,7332;0,4836
F;SMC;14,566;0,5553
F;SMC;15,9406;0,1167
F;SMC;15,6304;0,2296
F;SMC;14,0478;0,7063
F;SMC;15,5402;0,2821
F;SMC;15,6019;0,2443
F;SMC;15,6554;0,1979
F;SMC;14,7736;0,1631
F;SMC;16,1684;0,119
F;SMC;14,5113;0,5073
F;SMC;15,5466;0,134
F;SMC;15,1128;0,3919
F;SMC;13,4782;0,8109
F;SMC;15,8534;0,2208
F;SMC;13,1824;0,9072
F;SMC;15,8466;0,1901
1;LMC;13,9452;0,4076
1;LMC;14,3302;0,3149
1;LMC;12,9682;0,6984
1;LMC;15,0586;0,1023
1;LMC;14,328;0,304
1;LMC;15,024;0,0882
1;LMC;14,0594;0,3924
1;LMC;17,2026;-0,5304
1;LMC;14,327;0,3192
1;LMC;13,8748;0,4361
1;LMC;17,155;-0,4783
1;LMC;14,3154;0,3197
1;LMC;14,3376;0,2943
1;LMC;14,462;0,3461
1;LMC;14,139;0,3647
1;LMC;16,764;-0,4451
1;LMC;15,1618;0,1008
1;LMC;14,2229;0,3328
1;LMC;13,8046;0,4946
1;LMC;14,4268;0,2703
1;LMC;15,5032;-0,0368
1;LMC;15,9052;-0,1647
1;LMC;13,908;0,4434
1;LMC;14,3352;0,2986
1;LMC;13,6286;0,5326
1;LMC;13,7934;0,4842
1;LMC;14,3979;0,2817
1;LMC;14,0496;0,4238
1;LMC;14,4368;0,2939
1;LMC;14,3242;0,3164
1;LMC;12,6825;0,7719
1;LMC;13,846;0,4483
1;LMC;14,5746;0,2727
1;LMC;14,5171;0,2641
1;LMC;14,9218;0,1209
1;LMC;14,2248;0,3411
1;LMC;14,3478;0,3109
1;LMC;14,0999;0,357
1;LMC;14,5558;0,2632
1;LMC;13,7602;0,4936
1;LMC;14,5354;0,2775
1;LMC;13,5663;0,5364
1;LMC;17,0694;-0,4754
1;LMC;14,2915;0,3346
1;LMC;14,7311;0,218
1;LMC;13,6888;0,5417
1;LMC;14,627;0,2133
1;LMC;13,4404;0,597
1;LMC;14,7168;0,2212
1;LMC;15,0594;0,3161
1;LMC;15,0425;0,1061
1;LMC;16,815;-0,4438
1;LMC;16,001;-0,1914
1;LMC;14,4216;0,2488
1;LMC;14,4748;0,286
1;LMC;13,8631;0,466
1;LMC;14,676;0,2098
1;LMC;14,4089;0,3046
1;LMC;14,2384;0,3559
1;LMC;14,2154;0,3397
1;LMC;14,059;0,3829
1;LMC;14,7006;0,2089
1;LMC;13,2151;0,6923
1;LMC;14,5228;0,2442
1;LMC;14,1972;0,3233
1;LMC;14,7161;0,2052
1;LMC;14,4328;0,2944
1;LMC;14,4018;0,2906
1;LMC;14,7142;0,2083
1;LMC;14,5522;0,2311
1;LMC;13,6784;0,5121
1;LMC;14,396;0,31
1;LMC;14,5408;0,2582
1;LMC;13,9204;0,4699
1;LMC;14,3842;0,308
1;LMC;13,9161;0,4451
1;LMC;14,5161;0,2751
1;LMC;16,6794;-0,4003
1;LMC;14,2213;0,3356
1;LMC;14,0804;0,3867
1;LMC;14,3438;0,2957
1;LMC;16,7434;-0,4476
1;LMC;14,4333;0,2808
1;LMC;14,3312;0,2889
1;LMC;14,504;0,247
1;LMC;13,2101;0,6412
1;LMC;13,8247;0,4442
1;LMC;13,962;0,4153
1;LMC;14,0806;0,3598
1;LMC;14,4793;0,2675
1;LMC;14,8813;0,1499
1;LMC;14,5757;0,2212
1;LMC;14,409;0,2996
1;LMC;13,8864;0,4335
1;LMC;14,1462;0,3252
1;LMC;13,4634;0,5562
1;LMC;14,034;0,4077
1;LMC;17,5882;-0,6029
1;LMC;13,7698;0,4653
1;LMC;14,3287;0,3083
1;LMC;13,2086;0,6234
1;LMC;13,5732;0,546
1;LMC;15,48;-0,014
1;LMC;13,1248;0,6751
1;LMC;17,1166;-0,528
1;LMC;13,9133;0,4573
1;LMC;15,0072;0,1038
1;LMC;14,1087;0,3766
1;LMC;17,1206;-0,5551
1;LMC;14,6866;0,2054
1;LMC;13,4114;0,5868
1;LMC;15,8548;-0,1511
1;LMC;12,2802;0,6877
1;LMC;17,1984;-0,5196
1;LMC;13,2713;0,6421
1;LMC;14,537;0,2466
1;LMC;15,4264;0,0006
1;LMC;15,5466;-0,0351
1;LMC;14,5549;0,3135
1;LMC;14,8506;0,1502
1;LMC;15,1214;0,0971
1;LMC;14,0284;0,3934
1;LMC;13,0608;0,6455
1;LMC;14,4624;0,2676
1;LMC;15,2442;0,0527
1;LMC;13,9045;0,4276
1;LMC;14,0536;0,3947
1;LMC;14,0503;0,3833
1;LMC;14,2145;0,3506
1;LMC;14,3653;0,2799
1;LMC;12,2534;0,6564
1;LMC;13,4538;0,5395
1;LMC;16,7458;-0,3898
1;LMC;13,799;0,4515
1;LMC;14,3382;0,2787
1;LMC;13,6368;0,5072
1;LMC;13,4912;0,5308
1;LMC;14,8163;0,1739
1;LMC;13,8256;0,4412
1;LMC;14,3908;0,2858
1;LMC;14,9267;0,0972
1;LMC;14,5064;0,2072
1;LMC;13,899;0,4303
1;LMC;14,0764;0,3825
1;LMC;14,871;0,1848
1;LMC;14,8902;0,1544
1;LMC;14,1546;0,3697
1;LMC;14,7806;0,1531
1;LMC;15,3816;0,0162
1;LMC;14,1212;0,3378
1;LMC;14,6768;0,1847
1;LMC;14,229;0,3145
1;LMC;14,3439;0,2859
1;LMC;14,5225;0,183
1;LMC;14,222;0,3029
1;LMC;14,6786;0,2644
1;LMC;14,2882;0,3067
1;LMC;17,304;-0,4965
1;LMC;13,2234;0,6359
1;LMC;14,1998;0,341
1;LMC;16,9782;-0,4488
1;SMC;14,2801;0,5215
1;SMC;16,7184;-0,1413
1;SMC;15,6902;0,0745
1;SMC;16,1686;-0,057
1;SMC;14,6436;0,3746
1;SMC;16,573;-0,1489
1;SMC;15,4925;0,1575
1;SMC;15,0159;0,3255
1;SMC;15,5657;0,1226
1;SMC;14,3219;0,4484
1;SMC;16,5712;-0,1446
1;SMC;16,1988;-0,0829
1;SMC;15,4376;0,1613
1;SMC;13,6344;0,5874
1;SMC;14,3778;0,4716
1;SMC;14,2394;0,5057
1;SMC;15,8777;0,0206
1;SMC;16,7138;-0,1735
1;SMC;15,7367;0,0683
1;SMC;14,7922;0,3067
1;SMC;17,9934;-0,5486
1;SMC;14,1358;0,5249
1;SMC;14,8562;0,3176
1;SMC;15,5588;0,1312
1;SMC;14,3;0,5272
1;SMC;15,6038;0,0537
1;SMC;14,5812;0,4347
1;SMC;14,8804;0,3115
1;SMC;14,3614;0,4934
1;SMC;16,4298;-0,0449
1;SMC;15,8712;0,0365
1;SMC;14,3527;0,5141
1;SMC;15,639;0,0993
1;SMC;14,0709;0,4997
1;SMC;16,0837;0,0029
1;SMC;14,7445;0,4165
1;SMC;16,23;-0,0246
1;SMC;15,1252;0,2608
1;SMC;16,255;-0,043
1;SMC;15,4152;0,2079
1;SMC;15,6954;0,0998
1;SMC;14,8665;0,3692
1;SMC;15,7832;0,0378
1;SMC;14,8404;-0,2293
1;SMC;15,9228;0,0104
1;SMC;16,1484;0,0015
1;SMC;15,8728;0,0054
1;SMC;14,8986;0,2908
1;SMC;16,731;-0,2169
1;SMC;15,2766;0,1077
1;SMC;15,5933;0,0706
1;SMC;14,6399;0,3879
1;SMC;16,4613;-0,0989
1;SMC;15,1788;0,1832
1;SMC;16,2002;-0,0848
1;SMC;15,0008;0,2784
1;SMC;14,7586;0,2794
1;SMC;16,3034;-0,118
1;SMC;16,4006;-0,1251
1;SMC;15,849;-0,0155
1;SMC;16,3728;-0,0437
1;SMC;13,959;0,5954
1;SMC;15,9233;0,0135
1;SMC;15,1752;0,2438
1;SMC;14,8222;0,3179
1;SMC;16,0276;0,0558
1;SMC;15,2084;0,1235
1;SMC;16,3546;-0,1292
1;SMC;14,5508;0,4422
1;SMC;15,656;0,1128
1;SMC;15,2515;0,2473
1;SMC;15,8121;0,0231
1;SMC;15,6758;0,0838
1;SMC;16,729;-0,1389
1;SMC;16,2468;-0,126
1;SMC;13,9121;0,5834
1;SMC;14,368;0,4634
1;SMC;15,7206;0,0583
1;SMC;15,6693;0,0931
1;SMC;16,2687;-0,0599
1;SMC;15,0676;0,227
1;SMC;15,5143;0,1668
1;SMC;15,7076;0,0811
1;SMC;15,566;0,0386
1;SMC;16,1032;-0,0477
1;SMC;16,2852;-0,0936
1;SMC;13,9415;0,5344
1;SMC;13,7318;0,6038
1;SMC;14,6932;0,2731
1;SMC;17,5597;-0,4531
1;SMC;15,6816;0,0183
1;SMC;16,6984;-0,0744
1;SMC;15,0062;0,2869
1;SMC;15,8423;0,0837
1;SMC;15,6786;0,1166
1;SMC;14,6876;0,3651
1;SMC;15,5642;0,1374
1;SMC;16,8114;-0,1078
1;SMC;14,795;0,2782
1;SMC;14,2601;0,4012
1;SMC;16,4018;-0,1529
1;SMC;14,9727;0,2929
1;SMC;15,5267;0,1388
1;SMC;15,0455;0,2939
1;SMC;16,1594;-0,0279
1;SMC;15,6552;0,0574
1;SMC;14,4008;0,4278
1;SMC;16,1806;-0,0993
1;SMC;15,8383;0,0532
1;SMC;15,4704;0,1488
1;SMC;16,3872;-0,0714
1;SMC;14,7915;0,3349
1;SMC;13,9011;0,5528
1;SMC;16,5788;-0,1133
1;SMC;13,9728;0,5471
1;SMC;15,8312;0,048
1;SMC;15,696;0,0947
1;SMC;16,378;-0,0909
1;SMC;15,3721;0,1404
1;SMC;14,9808;0,2511
1;SMC;15,7881;0,0277
1;SMC;15,7657;0,0796
1;SMC;15,9406;0,0803
1;SMC;15,5712;0,1499
1;SMC;15,4664;0,1231
1;SMC;16,3175;-0,0522
1;SMC;15,4929;0,1124
1;SMC;13,5586;0,3835
1;SMC;16,205;-0,0705
1;SMC;15,55;0,08
1;SMC;17,5096;-0,2768
1;SMC;15,8832;0,0417
1;SMC;17,738;-0,542
1;SMC;14,5475;0,4257
1;SMC;15,4079;0,0751
1;SMC;16,2626;0,0103
1;SMC;14,5742;0,3754
1;SMC;16,521;-0,1554
1;SMC;16,791;-0,1832
1;SMC;15,4673;0,1727
1;SMC;14,2996;0,4629
1;SMC;13,6418;0,6525
1;SMC;15,7457;0,0729
1;SMC;15,4886;0,1447
1;SMC;14,7568;0,3357
1;SMC;15,482;0,1373
1;SMC;16,1634;-0,0447
1;SMC;15,7054;0,1234
1;SMC;14,5147;0,4154
1;SMC;15,0815;0,2683
1;SMC;15,992;-0,0153
1;SMC;14,3333;0,4373
1;SMC;15,3798;0,1507
1;SMC;15,957;-0,0025
1;SMC;15,889;0,0482
1;SMC;16,3458;-0,0707
1;SMC;15,565;0,17
1;SMC;15,0304;0,273
1;SMC;14,0869;0,4998
1;SMC;14,986;0,2767
1;SMC;16,144;-0,0551
1;SMC;15,5166;0,1347
1;SMC;14,3772;0,4966
1;SMC;15,8712;0,0196
1;SMC;14,6147;0,3938
1;SMC;16,7266;-0,1534
1;SMC;15,6266;0,1039
1;SMC;14,3126;0,4288
1;SMC;15,9238;-0,016
1;SMC;16,1556;-0,0916
1;SMC;14,6832;0,3555
1;SMC;14,9996;0,3125
1;SMC;14,8072;0,313
1;SMC;17,2238;-0,2249
1;SMC;14,2168;0,4893
1;SMC;16,0782;-0,0494
1;SMC;15,9124;0,0302
1;SMC;14,6897;0,3772
1;SMC;14,8998;0,317
1;SMC;14,3068;0,4708
1;SMC;14,9732;0,2529
1;SMC;16,1034;-0,0252
1;SMC;15,2416;0,2186
1;SMC;15,9578;-0,0056
1;SMC;14,605;0,3675
1;SMC;15,3892;0,1909
1;SMC;14,1306;0,5392
1;SMC;14,2198;0,4472
1;SMC;15,9806;0,1076
1;SMC;17,3222;-0,3888
1;SMC;14,8756;0,3077
1;SMC;16,4862;-0,1431
1;SMC;15,453;0,1643
1;SMC;15,719;0,105
1;SMC;15,0462;0,2544
1;SMC;14,3558;0,4541
1;SMC;13,7118;0,6472
1;SMC;14,9858;0,3054
1;SMC;14,7582;0,3293
1;SMC;15,8872;0,0343
1;SMC;14,2318;0,4783
1;SMC;15,7902;0,1023
1;SMC;15,7548;0,0084
1;SMC;16,3536;-0,1291
1;SMC;15,7356;0,0787
1;SMC;15,0988;0,2505
1;SMC;15,007;0,1926
1;SMC;15,0572;0,2629
1;SMC;15,4202;0,1177
1;SMC;14,5873;0,4062
1;SMC;14,274;0,472
1;SMC;15,953;0,032
1;SMC;15,1688;0,1666
1;SMC;15,4486;0,1694
1;SMC;16,2714;-0,084
1;SMC;14,1066;0,444
1;SMC;14,1883;0,4876
1;SMC;14,6876;0,3783
1;SMC;16,2804;-0,0307
1;SMC;16,004;0,0296
1;SMC;15,5427;0,0665
1;SMC;15,2691;0,1932
1;SMC;15,0723;0,2626
1;SMC;16,4086;-0,135
1;SMC;16,1279;-0,0629
1;SMC;14,6822;0,3247
1;SMC;16,1232;-0,1099
1;SMC;14,3967;0,4784
1;SMC;16,1678;-0,019
1;SMC;14,3868;0,4022
1;SMC;14,738;0,3264
1;SMC;15,8982;0,0036
1;SMC;16,0884;-0,0763
1;SMC;14,7889;0,3277
1;SMC;15,5037;0,1452
1;SMC;14,9974;0,3175
1;SMC;16,1114;-0,0793
1;SMC;15,5855;0,0736
1;SMC;15,1194;0,2507
1;SMC;15,1229;0,2498
1;SMC;15,5506;0,0998
1;SMC;15,8262;0,0085
1;SMC;17,6762;-0,4719
1;SMC;15,512;0,1091
1;SMC;15,1242;0,2304
1;SMC;14,8618;0,2606
1;SMC;15,8314;-0,0355
1;SMC;13,9661;0,5273
1;SMC;15,7528;0,0473
1;SMC;15,4834;0,1461
1;SMC;16,1654;0,0084
1;SMC;17,02;-0,0819
1;SMC;15,7764;0,0479
1;SMC;15,1877;0,2523
1;SMC;15,2879;0,1914
1;SMC;16,2964;-0,0454
1;SMC;15,5908;0,1223
1;SMC;15,6662;0,0394
1;SMC;15,5124;0,1418
1;SMC;14,876;0,2962
1;SMC;16,015;-0,0057
1;SMC;14,6491;0,4071
1;SMC;16,5376;-0,1862
1;SMC;16,4474;-0,1131
1;SMC;16,0558;0,0361
1;SMC;16,6338;-0,2435
1;SMC;18,2798;-0,5471
1;SMC;15,7256;0,0648
1;SMC;16,963;-0,2991
1;SMC;15,5069;0,1115
1;SMC;15,0298;0,1803
1;SMC;16,3346;-0,1174
1;SMC;14,794;0,3238
1;SMC;14,271;0,4877
1;SMC;15,9154;0,0438
1;SMC;16,5047;-0,1339
1;SMC;16,65;-0,1978
1;SMC;14,8017;0,3421
1;SMC;15,397;0,1778
1;SMC;16,8134;-0,2104
1;SMC;14,3519;0,421
1;SMC;14,6731;0,3168
1;SMC;15,2232;0,2349
1;SMC;14,6852;0,3608
1;SMC;14,9719;0,1979
1;SMC;15,1469;0,2306
1;SMC;15,2132;0,1439
1;SMC;14,788;0,3559
1;SMC;15,638;0,131
1;SMC;15,1227;0,1846
1;SMC;15,7846;-0,0333
1;SMC;16,1864;-0,0533
1;SMC;16,4067;-0,0201
1;SMC;16,7493;-0,236
1;SMC;16,5681;-0,2147
1;SMC;15,6974;0,0783
1;SMC;16,1395;-0,074
1;SMC;14,7655;0,3273
1;SMC;14,5638;0,3947
1;SMC;16,6594;-0,1952
1;SMC;16,1283;-0,0393
1;SMC;15,9034;0,0257
1;SMC;15,8515;0,0495
1;SMC;15,0717;0,3022
1;SMC;15,3598;0,1681
1;SMC;14,4274;0,4869
1;SMC;16,2396;-0,0553
1;SMC;16,082;-0,0294
1;SMC;14,8533;0,2512
1;SMC;14,6503;0,3586
1;SMC;16,1;-0,0353
1;SMC;15,6848;0,1708
1;SMC;15,9834;0,0201
1;SMC;14,3646;0,4274
1;SMC;15,285;0,1942
1;SMC;15,1247;0,2598
1;SMC;15,7448;0,0919
1;SMC;15,6758;0,1366
1;SMC;15,0902;0,226
1;SMC;14,0126;0,5439
1;SMC;15,9319;-0,082
1;SMC;15,0558;0,2398
1;SMC;14,5532;0,4375
1;SMC;14,8176;0,3557
1;SMC;15,1869;0,2378
1;SMC;14,5042;0,3989
1;SMC;14,7118;0,2721
1;SMC;14,5803;0,3939
1;SMC;15,4836;0,1186
1;SMC;15,2548;0,2071
1;SMC;15,5388;0,1499
1;SMC;15,507;0,1285
1;SMC;13,958;0,5414
1;SMC;16,4458;-0,0405
1;SMC;15,6919;0,0892
1;SMC;14,4196;0,4557
1;SMC;15,7577;0,03
1;SMC;16,382;-0,1317
1;SMC;14,456;0,4701
1;SMC;15,5165;0,0565
1;SMC;16,198;-0,0138
1;SMC;16,1511;-0,0355
1;SMC;14,3661;0,4568
1;SMC;15,088;0,2109
1;SMC;14,3802;0,4206
1;SMC;14,7786;0,2707
1;SMC;15,2855;0,3013
1;SMC;15,3114;0,1119
1;SMC;15,43;0,1134
1;SMC;16,1082;-0,0503
1;SMC;16,2348;-0,022
1;SMC;15,9953;-0,0417
1;SMC;15,2678;0,1952
1;SMC;15,1298;0,2325
1;SMC;15,1712;0,2456
1;SMC;15,5435;0,1342
1;SMC;15,8772;0,0307
A simple solution:
# apply the residual function to each (Mode, Cloud) group;
# the result is a Series holding one array per group
arrays = df.groupby(['Mode', 'Cloud']).apply(residual_function)
# flatten the per-group arrays into one list, in group order
residuals_value = []
for elem in arrays:
    residuals_value.extend(elem.tolist())
df["residuals"] = residuals_value

Issues Querying and Downloading Sentinel-3 OLCI Data with Sentinelsat

I am working with Sentinel-3 OLCI Level-2 Data Products through the Sentinelsat API, and I am running into issues with my query and with exceeding my data download quota. Overall, I would like to write a program that accepts a date range and a specific geographic location, then downloads a dataframe of all values in the "Oa04_radiance" band within the specified dates for that location. This is what I have so far:
from sentinelsat import SentinelAPI, read_geojson, geojson_to_wkt
from datetime import date
from geojson import Feature, Point, Polygon
api = SentinelAPI('user', 'password', 'https://apihub.copernicus.eu/apihub')
lon = -123.312383
lat = 49.319269
my_point = Point((lon, lat))
footprint = geojson_to_wkt(my_point)
products = api.query(footprint,
                     date=(date(2021, 1, 1), date(2021, 6, 15)),
                     platformname='Sentinel-3',
                     producttype='OL_2_LRR___',
                     cloudcoverpercentage=(0, 80))
products_df = api.to_dataframe(products)
api.download_all(products_df.index)
Error Output:
Traceback (most recent call last):
File "C:/Users/t7dej/Desktop/Turbid Time Local/SentSat/SenSat_mdl.py", line 48, in <module>
api.download_all(products_df_sorted.index)
File "E:\Software\Anaconda\lib\site-packages\sentinelsat\sentinel.py", line 723, in download_all
is_online = not self.trigger_offline_retrieval(pid)
File "E:\Software\Anaconda\lib\site-packages\sentinelsat\sentinel.py", line 636, in trigger_offline_retrieval
raise LTAError(msg, r)
sentinelsat.exceptions.LTAError: HTTP status 403 Forbidden: User quota exceeded: MediaRegulationException : An exception occured while creating a stream: Maximum number of 4 concurrent flows achieved by the user
Even when I set api.query(limit=1) I still receive this error message. The products_df is 173 MB and has a geometry column with a value of:
MULTIPOLYGON (((-146.081 -49.2196, -145.768 -48.2668, -145.201 -46.4727, -144.658 -44.6765, -144.135 -42.8782, -143.63 -41.0787, -143.142 -39.2767, -142.667 -37.4733, -142.204 -35.6694, -141.753 -33.863, -141.312 -32.0559, -140.878 -30.2474, -140.453 -28.4377, -140.033 -26.6268, -139.62 -24.8151, -139.211 -23.0024, -138.806 -21.1887, -138.404 -19.3744, -138.006 -17.5593, -137.609 -15.7434, -137.213 -13.927, -136.819 -12.1101, -136.425 -10.2928, -136.031 -8.47512, -135.636 -6.65724, -135.24 -4.83889, -134.843 -3.02078, -134.443 -1.2025, -134.04 0.61575, -133.634 2.43352, -133.224 4.25148, -132.81 6.06894, -132.391 7.88578, -131.965 9.702120000000001, -131.534 11.5179, -131.095 13.3329, -130.649 15.1468, -130.194 16.9598, -129.729 18.7716, -129.253 20.582, -128.767 22.3916, -128.267 24.1992, -127.753 26.0043, -127.224 27.8086, -126.678 29.6103, -126.113 31.4098, -125.527 33.2067, -124.919 35.0007, -124.286 36.7919, -123.624 38.5795, -122.932 40.3637, -122.205 42.1436, -121.44 43.9188, -120.631 45.6892, -119.773 47.4541, -118.861 49.2124, -117.886 50.9639, -116.841 52.7073, -115.714 54.4414, -114.495 56.1652, -113.168 57.8769, -111.717 59.5747, -110.12 61.256, -108.352 62.9179, -106.382 64.5568, -104.173 66.16800000000001, -101.677 67.7456, -98.8378 69.2821, -95.5855 70.7672, -91.83369999999999 72.1889, -87.4884 73.5312, -82.44029999999999 74.7715, -76.5813 75.8849, -69.83199999999999 76.83880000000001, -62.1773 77.59399999999999, -53.7161 78.1165, -44.6976 78.3725, -35.5051 78.3451, -26.5677 78.03619999999999, -18.2457 77.46639999999999, -10.7612 76.6694, -4.18714 75.6829, 3.64587789990069e-15 74.8448206506109, 1.50843 74.5429, 6.41354 73.28100000000001, 10.6367 71.9226, 14.2845 70.4875, 17.4519 68.99160000000001, 20.2206 67.4469, 21.6077 67.77589999999999, 23.0264 68.09050000000001, 24.4837 68.3925, 25.9603 68.679, 27.4944 68.9546, 29.0662 69.2161, 30.6951 69.46680000000001, 32.3417 69.6985, 34.024 69.9145, 35.7316 70.10760000000001, 37.4774 70.2903, 39.2531 70.4558, 41.0752 70.6046, 42.9054 70.7343, 44.7579 70.8455, 46.6297 70.938, 48.5174 71.01139999999999, 50.4176 71.0655, 52.3019 71.10290000000001, 52.0832 72.8877, 51.8771 74.6721, 51.6876 76.4562, 51.5212 78.2398, 51.3914 80.0231, 51.3212 81.80629999999999, 51.3582 83.58880000000001, 51.57959163346614 85.05115000000001, 3.911836325497215e-15 85.05115000000001, -133.5599156744917 85.05115000000001, -133.35 83.92870000000001, -133.276 82.14530000000001, -133.33 80.3614, -133.453 78.57859999999999, -133.611 76.7949, -133.799 75.0104, -134.002 73.226, -134.218 71.4409, -134.445 69.6553, -134.678 67.8693, -134.918 66.0826, -135.163 64.2954, -135.413 62.5074, -135.666 60.719, -135.923 58.93, -136.183 57.1401, -136.447 55.3495, -136.712 53.5582, -136.982 51.7664, -137.253 49.9738, -137.528 48.1799, -137.805 46.3858, -138.085 44.5911, -138.368 42.7954, -138.653 40.9991, -138.942 39.2019, -139.234 37.4039, -139.528 35.6058, -139.826 33.8069, -140.128 32.0072, -140.433 30.207, -140.741 28.4055, -141.054 26.6057, -141.37 24.8042, -141.691 23.0012, -142.016 21.1992, -142.346 19.3968, -142.68 17.5942, -143.02 15.7915, -143.365 13.9887, -143.716 12.1859, -144.073 10.3832, -144.436 8.58104, -144.806 6.77937, -145.183 4.97715, -145.567 3.17639, -145.96 1.37623, -146.361 -0.423407, -146.77 -2.22189, -147.19 -4.01968, -147.62 -5.81594, -148.06 -7.61083, -148.512 -9.40438, -148.976 -11.1962, -149.454 -12.9862, -149.946 -14.7742, -150.453 -16.5599, -150.975 -18.343, -151.516 -20.1234, -152.076 -21.901, -152.655 -23.6751, -153.257 -25.4461, -153.883 
-27.2129, -154.534 -28.9754, -155.214 -30.7335, -155.923 -32.487, -156.667 -34.2331, -157.447 -35.9749, -158.267 -37.7106, -159.131 -39.4374, -160.043 -41.1566, -161.008 -42.867, -162.033 -44.5672, -162.604 -45.4664, -161.805 -45.7119, -160.989 -45.9615, -160.165 -46.2051, -159.335 -46.4427, -158.497 -46.6742, -157.652 -46.8994, -156.79 -47.1211, -155.932 -47.3333, -155.068 -47.539, -154.207 -47.742, -153.329 -47.9343, -152.444 -48.1198, -151.538 -48.3011, -150.642 -48.4725, -149.738 -48.6368, -148.842 -48.7908, -147.928 -48.9408, -147.008 -49.0835, -146.081 -49.2196)))
I specified a GeoJSON point object in the products query, and I am wondering why it returns such a large multipolygon object in products_df. I suspect this is why products_df is so large and my quota is exceeded. Does anyone have any recommendations? Also, is it possible to query only the specific band 'Oa04_radiance' before downloading, since I do not need any of the other bands from the Sentinel-3 OLCI Level-2 data products?
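For what it's worth, a minimal sketch of a quota-friendly download loop, using the sentinelsat methods is_online() and download(); this only works around the LTA quota error from the traceback and does not address band selection:
# download only products that are already online, one at a time, so that no
# offline (Long Term Archive) retrievals are triggered in parallel
online_ids = [pid for pid in products_df.index if api.is_online(pid)]
for pid in online_ids[:1]:  # start with a single product to stay under quota
    api.download(pid)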

AttributeError: 'int' object has no attribute 'state'

This error pops up when I run the code I currently have.
Note: I did not write the code; I am simply trying to understand what's going on so that I can port it to a newer version of TuLiP.
Traceback (most recent call last):
File "vms5.py", line 270, in <module>
states = [aut_state.state]
AttributeError: 'int' object has no attribute 'state'
Line 270 says:
states = [aut_state.state]
I tried looking for state and found this at line 249:
state = dict(temp = Tmax, w = 0, h = 0, b = Bmax, a = 0, c = 0, nw = 0)
and aut_state at lines 259 and 260:
aut = createAut(aut_file = autfile, varnames = env_vars.keys() + sys_disc_vars.keys())
aut_state = aut.findNextAutState(current_aut_state=None, env_state=state)
Other terms with aut:
Line 47:
autfile = testfile+'.aut'
and lines 223-234:
# Check realizability
realizability = jtlvint.checkRealizability(smv_file=smvfile, spc_file=spcfile,
                                           aut_file=autfile, verbose=3)
# Compute an automaton
jtlvint.computeStrategy(smv_file=smvfile, spc_file=spcfile, aut_file=autfile,
                        priority_kind=3, verbose=3)
aut = automaton.Automaton(autfile, [], 3)
That's everything in the code that has aut-related terms. If you want more info, please let me know.
EDIT
I tried adding print(aut_state) before line 270 and got -1 as the output.
So aut_state is an int, and ints don't have an attribute called state. Whatever set that variable set it to an int, which looks like an error code to me. Look at the code for findNextAutState: what does it return when there are no more AutStates? -1?
A condition check is probably missing.
From the traceback it is clear that aut_state is an integer, and an integer cannot have an attribute called state. The main problem lies inside createAut(), which creates the aut object, or inside the findNextAutState() function, which returns aut_state.
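As an illustration only, here is a sketch of the guard both answers point at, assuming findNextAutState returns -1 when no successor state exists (which is what the print(aut_state) experiment suggests):
aut_state = aut.findNextAutState(current_aut_state=None, env_state=state)
if aut_state == -1:
    # -1 is assumed to be an error code: no automaton state matched env_state
    raise RuntimeError("findNextAutState found no matching automaton state")
states = [aut_state.state]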
