Print a new line each time the array index in a loop reaches a certain number - Python

I have this array (as an example of my problem):
[0.8067, 0.7152, 0.4551, 0.7519, 0.3419, 0.7161, 0.3793, 0.6859, 0.4205, 0.5129, 0.5534, 0.5995, 0.4999, 0.5136, 0.8194, 0.4855, 0.6822, 0.4924, 0.5359, 0.4083, 0.5078, 0.7260, 0.6876, 0.7033, 0.5777, 0.4515, 0.5460,0.5842, 0.7296, 0.7570, 0.4579, 0.3252, 0.4683, 0.3646, 0.3220, 0.4150,0.3263, 0.6402, 0.3184, 0.7087, 0.2958, 0.5384, 0.5462, 0.3933, 0.2963,0.6883, 0.4766, 0.5430, 0.4943, 0.2810, 0.4785, 0.5618, 0.6941, 0.4943,0.3793, 0.7629, 0.6058, 0.6419, 0.4902, 0.3158, 0.7923, 0.7335, 0.5624,0.5390, 0.5337, 0.8333, 0.7519, 0.6591, 0.5301, 0.3020, 0.8187, 0.8084,0.3412, 0.7912, 0.6240, 0.4296, 0.4908, 0.6560, 0.7366, 0.5219, 0.8128,0.3683, 0.6037, 0.4570, 0.3640, 0.4717, 0.5948, 0.6294, 0.8222, 0.7323,0.4344, 0.4371, 0.5013, 0.2913, 0.4335, 0.4046, 0.6788, 0.5917, 0.8369,0.8983, 0.3981, 0.8857, 0.4309, 0.6197, 0.7020, 0.3666, 0.5837, 0.3259,0.7193, 0.3719, 0.7098, 0.4088, 0.5421, 0.5039, 0.3664, 0.5499, 0.8648, 0.3217, 0.7696, 0.5970, 0.4611, 0.3465, 0.6396, 0.6688, 0.5773, 0.7444,0.7232, 0.5695, 0.5801, 0.5218, 0.8099, 0.6983, 0.5733, 0.3286, 0.6736, 0.6470, 0.9196, 0.7589, 0.7610, 0.8454, 0.6261, 0.6229, 0.7600, 0.5022,0.3035, 0.5229, 0.5353, 0.4962, 0.8466, 0.1817, 0.5271, 0.6928, 0.7898,0.4182, 0.5234, 0.4112, 0.4812, 0.7522, 0.4209, 0.7217, 0.6545, 0.6954,0.3139, 0.5253, 0.5467, 0.3606, 0.6640, 0.7399, 0.7965, 0.5742, 0.5729,0.6213, 0.7981, 0.5613, 0.4904, 0.7292, 0.5686, 0.8421, 0.7316, 0.6408,0.6550, 0.3902, 0.5353, 0.5459, 0.4035, 0.3390, 0.4407, 0.7370, 0.4466,0.4029, 0.8216, 0.4862, 0.7136, 0.3544, 0.7967, 0.2909, 0.4384, 0.5505,0.6768, 0.5122, 0.6042, 0.5240, 0.4299, 0.3714, 0.6224, 0.6549, 0.7901,0.7289, 0.7580, 0.5656, 0.7841, 0.7520, 0.8379, 0.4449, 0.4860, 0.6904,0.7279, 0.6378, 0.4493, 0.5407, 0.6737, 0.5260, 0.8009, 0.6307, 0.6026,0.5197, 0.7532, 0.4754, 0.6674, 0.6768, 0.7643, 0.6705, 0.7871, 0.6135, 0.7762, 0.6081, 0.4060, 0.3688, 0.5848, 0.4235, 0.6011, 0.6949, 0.4410,0.8054, 0.7706, 0.3644, 0.6820, 0.6351, 0.4282, 0.4613, 0.7392, 
0.5208,0.4409, 0.5589, 0.3401, 0.5811, 0.7109, 0.3245, 0.5017, 0.6650, 0.5949,0.5680, 0.4445, 0.7482, 0.3044, 0.7760, 0.4396, 0.4067, 0.3840, 0.7426,0.5989, 0.5169, 0.7056, 0.4329, 0.5555]
What I want to do is print a newline each time the array index reaches a multiple of 9, before continuing with the next array value.
So I expect it to print the array like this:
[0.8067, 0.7152, 0.4551, 0.7519, 0.3419, 0.7161, 0.3793, 0.6859, 0.4205,
0.5129, 0.5534, 0.5995, 0.4999, 0.5136, 0.8194, 0.4855, 0.6822, 0.4924,
0.5359, 0.4083, 0.5078, 0.7260, 0.6876, 0.7033, 0.5777, 0.4515, 0.5460,
0.5842, 0.7296, 0.7570, 0.4579, 0.3252, 0.4683, 0.3646, 0.3220, 0.4150,
0.3263, 0.6402, 0.3184, 0.7087, 0.2958, 0.5384, 0.5462, 0.3933, 0.2963,
0.6883, 0.4766, 0.5430, 0.4943, 0.2810, 0.4785, 0.5618, 0.6941, 0.4943,
0.3793, 0.7629, 0.6058, 0.6419, 0.4902, 0.3158, 0.7923, 0.7335, 0.5624,
0.5390, 0.5337, 0.8333, 0.7519, 0.6591, 0.5301, 0.3020, 0.8187, 0.8084,
0.3412, 0.7912, 0.6240, 0.4296, 0.4908, 0.6560, 0.7366, 0.5219, 0.8128,
0.3683, 0.6037, 0.4570, 0.3640, 0.4717, 0.5948, 0.6294, 0.8222, 0.7323,
0.4344, 0.4371, 0.5013, 0.2913, 0.4335, 0.4046, 0.6788, 0.5917, 0.8369,
0.8983, 0.3981, 0.8857, 0.4309, 0.6197, 0.7020, 0.3666, 0.5837, 0.3259,
0.7193, 0.3719, 0.7098, 0.4088, 0.5421, 0.5039, 0.3664, 0.5499, 0.8648,
0.3217, 0.7696, 0.5970, 0.4611, 0.3465, 0.6396, 0.6688, 0.5773, 0.7444,
0.7232, 0.5695, 0.5801, 0.5218, 0.8099, 0.6983, 0.5733, 0.3286, 0.6736,
0.6470, 0.9196, 0.7589, 0.7610, 0.8454, 0.6261, 0.6229, 0.7600, 0.5022,
0.3035, 0.5229, 0.5353, 0.4962, 0.8466, 0.1817, 0.5271, 0.6928, 0.7898,
0.4182, 0.5234, 0.4112, 0.4812, 0.7522, 0.4209, 0.7217, 0.6545, 0.6954,
0.3139, 0.5253, 0.5467, 0.3606, 0.6640, 0.7399, 0.7965, 0.5742, 0.5729,
0.6213, 0.7981, 0.5613, 0.4904, 0.7292, 0.5686, 0.8421, 0.7316, 0.6408,
0.6550, 0.3902, 0.5353, 0.5459, 0.4035, 0.3390, 0.4407, 0.7370, 0.4466,
0.4029, 0.8216, 0.4862, 0.7136 ]
How can I achieve this? Thank you in advance.

Try this:
for i in range(len(arr)):
    if i % 9 == 0 and i != 0:  # skip i == 0 so there is no leading blank line
        print()
    print(arr[i], end=" ")
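An equivalent sketch (assuming the values are in a plain Python list named `arr`) slices the list into chunks of nine and joins each chunk, which avoids tracking the index by hand:

```python
# Print a list nine values per line by slicing it into chunks of nine.
# `arr` is a short hypothetical sample standing in for the full array.
arr = [0.8067, 0.7152, 0.4551, 0.7519, 0.3419, 0.7161, 0.3793, 0.6859, 0.4205,
       0.5129, 0.5534, 0.5995, 0.4999, 0.5136, 0.8194, 0.4855, 0.6822, 0.4924]

# One joined string per group of nine values.
lines = [", ".join(f"{v:.4f}" for v in arr[i:i + 9])
         for i in range(0, len(arr), 9)]
print(",\n".join(lines))
```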


Flatten Array from 2d to 1D with same length

I need to flatten a (2, 300) array into a (300,) array. Maybe "flatten" is not the right term: I want to take the two rows element by element, compute the mean of each pair of elements, and end up with a (300,) vector. I thought changing it manually could work, but since this happens inside a loop I haven't found a solution yet.
I tried reading the documentation and searched for NumPy options, but couldn't find a solution so far.
[[ 1.56250000e-01 -2.18750000e-01 -2.81250000e-01 -2.26562500e-01
-8.78906250e-02 4.08203125e-01 1.66992188e-01 1.68457031e-02
3.20312500e-01 -5.22460938e-02 -3.19824219e-02 5.90820312e-02
1.32812500e-01 -1.55273438e-01 2.87109375e-01 1.30859375e-01
8.88671875e-02 4.56542969e-02 -4.37011719e-02 -4.96093750e-01
5.85937500e-02 1.35742188e-01 -1.81884766e-02 7.32421875e-02
-2.16796875e-01 5.66406250e-02 -1.66015625e-01 4.66308594e-02
-1.04492188e-01 1.65039062e-01 -2.61718750e-01 3.27148438e-02
-2.12890625e-01 3.63281250e-01 -2.80761719e-02 -2.40234375e-01
1.21093750e-01 2.75390625e-01 -4.49218750e-02 -1.44531250e-01
-1.61132812e-01 1.75781250e-01 1.69921875e-01 2.94921875e-01
3.41796875e-01 3.02124023e-03 -4.90722656e-02 9.88769531e-03
2.21679688e-01 -3.53515625e-01 -2.83203125e-01 -1.35742188e-01
-2.53906250e-02 -2.14843750e-01 1.51367188e-01 3.73046875e-01
4.68750000e-02 2.40478516e-02 -1.77734375e-01 2.49023438e-01
-5.85937500e-02 9.94873047e-03 -1.93359375e-01 1.38671875e-01
-1.30859375e-01 -4.34570312e-02 -5.54199219e-02 2.50000000e-01
-2.36328125e-01 3.12500000e-01 4.66796875e-01 -6.05468750e-02
2.25585938e-01 -2.26562500e-01 -4.21875000e-01 -2.11914062e-01
-1.37695312e-01 -1.56250000e-01 2.39257812e-02 -3.55468750e-01
-2.28515625e-01 3.82812500e-01 8.49609375e-02 4.00390625e-01
-4.19921875e-02 9.91210938e-02 6.74438477e-03 3.59375000e-01
9.42382812e-02 6.93359375e-02 -2.16796875e-01 -2.34375000e-01
-2.02148438e-01 9.94873047e-03 1.38671875e-01 1.24511719e-01
2.89062500e-01 1.10839844e-01 -6.49414062e-02 -4.08203125e-01
-4.37500000e-01 2.33154297e-02 1.02539062e-01 2.19726562e-01
-2.10937500e-01 9.47265625e-02 2.94921875e-01 3.14941406e-02
2.92968750e-01 -1.28906250e-01 -1.65039062e-01 -1.66992188e-01
1.49414062e-01 -1.44531250e-01 -1.22558594e-01 6.39648438e-02
-2.48046875e-01 -3.30078125e-01 3.93066406e-02 -1.69921875e-01
4.80957031e-02 -2.01171875e-01 -4.10156250e-01 1.65039062e-01
2.29492188e-01 -1.45507812e-01 -2.53906250e-01 -2.27539062e-01
1.33789062e-01 6.83593750e-02 -8.69140625e-02 4.15039062e-02
-1.96289062e-01 1.69921875e-01 6.54296875e-02 6.34765625e-02
1.22070312e-01 1.28906250e-01 -4.29687500e-02 -1.12792969e-01
1.22070312e-01 -1.85546875e-01 -2.89062500e-01 -2.91015625e-01
-1.98242188e-01 2.50000000e-01 2.46093750e-01 2.14843750e-01
-1.81640625e-01 6.98852539e-03 3.35937500e-01 -1.90429688e-01
-1.95312500e-02 4.23828125e-01 -1.04980469e-01 -7.81250000e-02
-1.31835938e-01 -1.21582031e-01 -1.50756836e-02 1.92382812e-01
9.03320312e-03 1.70898438e-01 2.83203125e-01 -1.08398438e-01
-5.07812500e-02 -4.58984375e-02 -1.44531250e-01 -1.38671875e-01
-3.55468750e-01 -6.44531250e-02 -1.21582031e-01 2.12402344e-02
-4.68750000e-01 2.34375000e-02 -1.48437500e-01 -1.45507812e-01
6.98242188e-02 -4.23828125e-01 -3.73535156e-02 1.45507812e-01
-3.10546875e-01 1.82617188e-01 1.31835938e-01 6.49414062e-02
3.04687500e-01 2.11914062e-01 -1.84570312e-01 -7.22656250e-02
1.39648438e-01 5.22460938e-02 -1.01562500e-01 -2.07031250e-01
-3.98437500e-01 -2.91015625e-01 -1.37695312e-01 3.76953125e-01
-1.37695312e-01 -1.91406250e-01 -4.05273438e-02 -5.61523438e-02
2.20947266e-02 3.93066406e-02 -3.39843750e-01 9.32617188e-02
3.75000000e-01 3.66210938e-02 -2.14843750e-02 -1.67968750e-01
-3.12500000e-01 1.81884766e-02 1.75781250e-01 -2.79541016e-02
-4.02343750e-01 -5.20019531e-02 8.88671875e-02 -5.00000000e-01
-1.25976562e-01 -9.37500000e-02 -2.85156250e-01 -1.66015625e-01
-1.92382812e-01 2.38281250e-01 -7.78198242e-03 2.00195312e-01
-1.07421875e-01 2.71484375e-01 -1.25000000e-01 -1.30462646e-03
-3.88183594e-02 -2.34375000e-01 -1.07910156e-01 4.68750000e-02
8.78906250e-03 -2.53906250e-01 6.78710938e-02 -6.15234375e-02
-5.20019531e-02 -9.96093750e-02 2.59765625e-01 -4.07714844e-02
1.03515625e-01 1.22680664e-02 -1.94335938e-01 -3.65234375e-01
2.57812500e-01 -4.19921875e-02 6.73828125e-02 1.73828125e-01
3.29589844e-02 -2.13867188e-01 4.83989716e-05 7.86132812e-02
7.86132812e-02 1.38671875e-01 1.07910156e-01 -4.10156250e-01
1.72851562e-01 -1.78710938e-01 6.56127930e-03 1.19628906e-01
2.55859375e-01 -4.76074219e-02 5.32226562e-02 -3.20312500e-01
-2.57812500e-01 1.11816406e-01 -1.67968750e-01 -2.73437500e-01
-4.19921875e-01 2.07031250e-01 8.10546875e-02 -1.47460938e-01
6.28662109e-03 -5.61523438e-02 -6.25610352e-03 1.25000000e-01
3.10058594e-02 2.83203125e-02 -3.90625000e-01 -1.04003906e-01
1.82617188e-01 4.61425781e-02 -1.11389160e-03 2.50000000e-01
1.77734375e-01 -1.07910156e-01 -3.97949219e-02 -1.62109375e-01
1.44653320e-02 4.49218750e-01 -1.12792969e-01 1.76757812e-01
-1.74804688e-01 -3.04687500e-01 3.94531250e-01 -5.93261719e-02
1.15234375e-01 4.05883789e-03 -2.12890625e-01 2.98828125e-01]
[ 2.01171875e-01 -2.75878906e-02 -1.13281250e-01 1.90429688e-01
-2.98828125e-01 -1.91406250e-01 1.83593750e-01 -2.43164062e-01
3.61328125e-02 -9.32617188e-02 -1.60156250e-01 2.42919922e-02
4.71191406e-02 -1.25976562e-01 -2.24609375e-02 1.20117188e-01
-3.57055664e-03 -8.44726562e-02 1.56250000e-01 -4.39453125e-01
3.53515625e-01 2.23632812e-01 -3.44238281e-02 1.06445312e-01
-3.82812500e-01 3.02734375e-01 -1.39648438e-01 3.98437500e-01
1.69921875e-01 2.72216797e-02 -1.04492188e-01 1.77734375e-01
-1.93359375e-01 -1.87500000e-01 -6.05468750e-02 -2.29492188e-01
3.61328125e-01 3.02734375e-01 -1.21093750e-01 -2.23632812e-01
3.81469727e-05 8.88671875e-02 1.31835938e-01 -5.90820312e-02
2.43164062e-01 3.00292969e-02 1.88476562e-01 2.36816406e-02
2.20703125e-01 -2.96875000e-01 -3.63281250e-01 -1.71875000e-01
1.56250000e-01 -3.45703125e-01 -3.02734375e-01 2.34375000e-01
-1.01074219e-01 -1.25000000e-01 -6.34765625e-02 4.16015625e-01
-3.02734375e-01 -3.84765625e-01 -1.16699219e-01 1.08886719e-01
9.57031250e-02 1.26953125e-01 -6.00585938e-02 -2.59765625e-01
2.61718750e-01 1.72851562e-01 1.63085938e-01 2.91015625e-01
-5.85937500e-02 -3.30078125e-01 -2.15148926e-03 -5.46875000e-01
3.08593750e-01 1.92382812e-01 1.02539062e-01 4.12597656e-02
7.27539062e-02 2.37304688e-01 5.05371094e-02 4.15039062e-03
-4.19921875e-02 -9.71679688e-02 -3.57055664e-03 7.03125000e-02
2.96875000e-01 -1.36718750e-01 -2.68554688e-02 1.39648438e-01
-2.27539062e-01 -1.55273438e-01 -2.66113281e-02 2.16796875e-01
4.41406250e-01 -1.62109375e-01 5.51757812e-02 1.35498047e-02
-8.49609375e-02 -1.75781250e-01 -4.39453125e-01 6.73828125e-02
-2.31445312e-01 1.45507812e-01 3.61328125e-02 5.44433594e-02
2.53906250e-01 -2.91015625e-01 6.25000000e-02 -3.20312500e-01
1.57226562e-01 -6.10351562e-02 1.37695312e-01 2.35351562e-01
-2.13623047e-02 1.37695312e-01 -7.14111328e-03 -2.36328125e-01
9.88769531e-03 -1.04492188e-01 -2.43164062e-01 2.32421875e-01
4.27246094e-02 -3.81469727e-04 7.86132812e-02 1.96289062e-01
1.05957031e-01 7.32421875e-02 3.73535156e-02 -1.70898438e-01
-1.80664062e-01 3.88183594e-02 2.43164062e-01 -7.32421875e-02
1.74804688e-01 -1.21459961e-02 2.04101562e-01 -1.35742188e-01
9.13085938e-02 -5.50781250e-01 -7.81250000e-02 -4.22363281e-02
2.14843750e-01 -1.63085938e-01 1.20117188e-01 -5.95092773e-03
3.34472656e-02 -2.53906250e-01 3.02734375e-01 5.93261719e-02
2.18750000e-01 -5.00488281e-03 1.46484375e-01 -9.57031250e-02
-1.68945312e-01 2.52685547e-02 9.09423828e-03 1.01562500e-01
1.07910156e-01 -6.88476562e-02 2.91015625e-01 7.56835938e-03
-1.95312500e-01 -3.00781250e-01 -1.73828125e-01 2.89062500e-01
-1.13769531e-01 -1.27929688e-01 -2.59765625e-01 2.14843750e-01
2.61230469e-02 4.56542969e-02 -1.78710938e-01 -2.39257812e-01
2.19726562e-02 -2.26562500e-01 -3.35937500e-01 -6.15234375e-02
-4.14062500e-01 1.22070312e-01 -1.56250000e-01 6.64062500e-02
6.32812500e-01 9.42382812e-02 -2.77343750e-01 1.30859375e-01
-7.76367188e-02 1.77734375e-01 4.24804688e-02 -1.41601562e-01
-1.72851562e-01 -1.39648438e-01 1.77734375e-01 9.03320312e-02
-4.72656250e-01 -7.17773438e-02 -1.12792969e-01 1.42578125e-01
1.06445312e-01 1.31835938e-01 -2.32421875e-01 -3.78417969e-02
4.37500000e-01 -1.16699219e-01 -4.73632812e-02 2.10937500e-01
6.34765625e-02 -9.88769531e-03 -1.25976562e-01 -4.76074219e-03
-1.14257812e-01 1.13281250e-01 4.17480469e-02 -3.57421875e-01
2.27539062e-01 6.10351562e-02 -1.64062500e-01 4.46777344e-02
-4.53125000e-01 2.92968750e-01 -1.26953125e-01 2.04101562e-01
-3.41796875e-01 1.62109375e-01 1.68945312e-01 -1.95312500e-01
-1.32812500e-01 -3.69140625e-01 -1.73828125e-01 2.22656250e-01
-5.41992188e-02 9.03320312e-02 -8.00781250e-02 -9.27734375e-02
-9.81445312e-02 -1.60156250e-01 5.23437500e-01 8.05664062e-02
1.38671875e-01 1.97265625e-01 -4.04296875e-01 -4.04296875e-01
-9.08203125e-02 -2.62451172e-02 5.10253906e-02 3.90625000e-01
3.65234375e-01 -2.50000000e-01 -4.02343750e-01 5.20019531e-02
-1.11694336e-02 -3.47656250e-01 2.08007812e-01 -2.55859375e-01
4.90722656e-02 -2.65625000e-01 -1.24023438e-01 -3.26171875e-01
1.44531250e-01 -1.88476562e-01 -8.05664062e-02 -2.85156250e-01
1.33789062e-01 1.56250000e-01 3.27148438e-02 -2.81982422e-02
-1.05468750e-01 -4.56542969e-02 2.21679688e-01 3.22265625e-01
-2.06054688e-01 -1.20117188e-01 -9.37500000e-02 1.00585938e-01
1.90429688e-01 -3.63281250e-01 -2.75390625e-01 1.87500000e-01
1.63085938e-01 -7.51953125e-02 3.59375000e-01 1.75781250e-01
1.45507812e-01 2.69531250e-01 -3.43750000e-01 -2.92968750e-01
1.06445312e-01 3.29589844e-02 2.23632812e-01 2.59765625e-01
-2.71484375e-01 2.84423828e-02 2.11914062e-01 -4.68750000e-01
7.71484375e-02 -3.10546875e-01 -1.62109375e-01 4.17968750e-01]]
To change the (2, 300) array above,
I need to do it inside a loop, which is:
def preprocess_and_vectorize(text):
    # remove stop words and punctuation, then lemmatize the text
    doc = nlp(text)
    filtered_tokens = []
    for token in doc:
        if token.is_stop or token.is_punct:
            continue
        filtered_tokens.append(token.lemma_)
    return wv[filtered_tokens]
This turns out to be easy using a gensim method in the return statement:
return wv.get_mean_vector(filtered_tokens)
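Outside of gensim, the same (2, n) → (n,) reduction can be sketched with NumPy's mean over axis 0 (a minimal example with n = 4 for brevity):

```python
import numpy as np

# Two rows of equal length; averaging element-wise down the first axis
# collapses shape (2, n) to (n,).
pair = np.array([[0.2, 0.4, 0.6, 0.8],
                 [0.4, 0.6, 0.8, 1.0]])

mean_vec = pair.mean(axis=0)
print(mean_vec)  # [0.3 0.5 0.7 0.9]
```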

how is the output of this nested loop being calculated?

Hi, I have this calculation, but I am failing to understand how this line of the output, array([1050885., 1068309., 1085733., 1103157., 1120581.]), is calculated. Please explain.
Creating sample data:
data1 = pd.DataFrame({"client": ['x1', 'x2'],
                      "cat": ['Bb', 'Ee'],
                      "amt": [1000, 300],
                      "time": [2, 3],
                      "group": [10, 25]})
listc = ['Aa', 'Bb', 'Cc', 'Dd', 'Ee']
val1 = pd.DataFrame({'time': [1, 2, 3],
                     'lim %': [0.1, 0.11, 0.112]})
val2 = pd.concat([pd.DataFrame({'group': g, 'perc': 0.99, 'time': range(1, 11)})
                  for g in data1['group'].unique()]).explode('time')
mat = np.arange(75).reshape(3, 5, 5)
vals = [val1, val2]
data1['cat'] = pd.Categorical(data1['cat'],
                              categories=listc,
                              ordered=True).codes
for i in range(len(vals)):
    if 'group' in vals[i].columns:
        vals[i] = vals[i].set_index(['time', 'group'])
    else:
        vals[i] = vals[i].set_index(['time'])
# nested loop calculation
calc = {}
for client, cat, amt, start, group in data1.itertuples(name=None, index=False):
    for time in range(start, len(mat) + 1):
        if time == start:
            calc[client] = [[amt * mat[time - 1, cat, :]]]
        else:
            calc[client].append([calc[client][-1][-1] @ mat[time - 1]])
        for valcal in vals:
            if isinstance(valcal.index, pd.MultiIndex):
                value = valcal.loc[(time, group)].iat[0]
            else:
                value = valcal.loc[time].iat[0]
            calc[client][-1].append(value * calc[client][-1][-1])
output:
{'x1': [[array([30000, 31000, 32000, 33000, 34000]),
array([3300., 3410., 3520., 3630., 3740.]),
array([3267. , 3375.9, 3484.8, 3593.7, 3702.6])],
[array([1050885., 1068309., 1085733., 1103157., 1120581.]), #how is this line calculated?
array([117699.12 , 119650.608, 121602.096, 123553.584, 125505.072]),
array([116522.1288 , 118454.10192, 120386.07504, 122318.04816,
124250.02128])]],
'x2': [[array([21000, 21300, 21600, 21900, 22200]),
array([2352. , 2385.6, 2419.2, 2452.8, 2486.4]),
array([2328.48 , 2361.744, 2395.008, 2428.272, 2461.536])]]}
What I need the calc for this line to be is:
[array([1050885., 1068309., 1085733., 1103157., 1120581.])]
It should take array([3267., 3375.9, 3484.8, 3593.7, 3702.6]) multiplied by mat at time 3 (i.e. mat[2]). How can I get it to do this?
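That line is exactly the previous result vector matrix-multiplied by mat[2] (mat at time 3) — the `@` in `calc[client][-1][-1] @ mat[time - 1]` is NumPy matrix multiplication. A minimal check with only NumPy:

```python
import numpy as np

mat = np.arange(75).reshape(3, 5, 5)  # the question's sample matrix
prev = np.array([3267.0, 3375.9, 3484.8, 3593.7, 3702.6])

# Row vector times the 5x5 slice for time 3 (index 2):
result = prev @ mat[2]
print(result)  # [1050885. 1068309. 1085733. 1103157. 1120581.]
```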

How can I smooth the graph values or extract only the main signal?

When I try to run the code below, I get this graph.
My code:
from numpy import nan
import json
import os
import numpy as np
import subprocess
import math
import matplotlib.pyplot as plt
from statistics import mean, stdev

def smooth(t):
    new_t = []
    for i, x in enumerate(t):
        neighbourhood = t[max(i - 2, 0): i + 3]
        m = mean(neighbourhood)
        s = stdev(neighbourhood, xbar=m)
        if abs(x - m) > s:
            x = (t[i - 1 + (i == 0) * 2] + t[i + 1 - (i + 1 == len(t)) * 2]) / 2
        new_t.append(x)
    return new_t
def outLiersFN(*U):
    outliers = []  # list after preprocessing
    # preprocessing Fc = | 2*LF1 prev by 1 - LF2 prev by 2 |
    c0 = -2  # (previous by 2) from original
    c1 = -1  # (previous) from original
    c2 = 0   # (current) from original
    c3 = 1   # (next) from original
    preP = U[0]  # original list
    if c2 == 0:
        outliers.append(preP[0])
        c1 += 1
        c2 += 1
        c0 += 1
        c3 += 1
    oldlen = len(preP)
    M_RangeOfMotion = 90
    while oldlen > c2:
        if c3 == oldlen:
            outliers.insert(c2, preP[c2])  # preP[c2] is the last element of the old list
            break
        if (preP[c2] > M_RangeOfMotion and preP[c2] < (preP[c1] + preP[c3]) / 2) or \
           (preP[c2] < M_RangeOfMotion and preP[c2] > (preP[c1] + preP[c3]) / 2):  # check paper 3.3.1
            Equ = (preP[c1] + preP[c3]) / 2  # preprocessing: replace with the neighbours' average
            equu = float("{:.2f}".format(Equ))  # keep 2 decimal places
            outliers.insert(c2, equu)  # insert the preprocessed value into the list
        else:
            equu = float("{:.2f}".format(preP[c2]))  # keep the element unchanged, 2 decimal places
            outliers.insert(c2, equu)
        c1 += 1
        c2 += 1
        c0 += 1
        c3 += 1
    return outliers
def remove_nan(values):  # renamed parameter from `list` to avoid shadowing the built-in
    return [x for x in values if not math.isnan(x)]
the_angel = [176.04, 173.82, 170.09, 165.3, 171.8, 178.3, 178.77, 179.24, 179.93, 180.0, 173.39, 166.78, 166.03, 165.28, 165.72, 166.17, 166.71, 167.26, 168.04, 167.22, 166.68, 166.13, 161.53, 165.81, 170.1, 170.05, 170.5, 173.01, 176.02, 174.53, 160.09, 146.33, 146.38, 146.71, 150.33, 153.95, 154.32, 154.69, 134.52, 114.34, 115.6, 116.86, 134.99, 153.12, 152.28, 151.43, 151.36, 152.32, 158.9, 166.52, 177.74, 178.61, 179.47, 167.44, 155.4, 161.54, 167.68, 163.96, 160.24, 137.45, 114.66, 117.78, 120.89, 139.95, 139.62, 125.51, 111.79, 112.07, 112.74, 110.22, 107.7, 107.3, 106.52, 105.73, 103.07, 101.35, 102.5, 104.59, 104.6, 104.49, 104.38, 102.81, 101.25, 100.62, 100.25, 100.15, 100.32, 99.84, 99.36, 100.04, 100.31, 99.14, 98.3, 97.92, 97.41, 96.9, 96.39, 95.88, 95.9, 95.9, 96.02, 96.14, 96.39, 95.2, 94.56, 94.02, 93.88, 93.8, 93.77, 93.88, 94.04, 93.77, 93.65, 93.53, 94.2, 94.88, 92.59, 90.29, 27.01, 32.9, 38.78, 50.19, 61.59, 61.95, 62.31, 97.46, 97.38, 97.04, 96.46, 96.02, 96.1, 96.33, 95.61, 89.47, 89.34, 89.22, 89.48, 89.75, 90.02, 90.28, 88.16, 88.22, 88.29, 88.17, 88.17, 94.98, 94.84, 94.69, 94.94, 94.74, 94.54, 94.69, 94.71, 94.64, 94.58, 94.19, 94.52, 94.85, 87.7, 87.54, 87.38, 95.71, 96.57, 97.11, 97.05, 96.56, 96.07, 95.76, 95.56, 95.35, 95.28, 95.74, 96.2, 96.32, 96.33, 96.2, 96.14, 96.07, 96.07, 96.12, 96.17, 96.28, 96.31, 96.33, 96.16, 96.05, 95.94, 95.33, 88.96, 95.0, 95.78, 88.19, 88.19, 88.19, 87.92, 87.93, 88.03, 87.94, 87.86, 87.85, 87.89, 88.08, 88.01, 87.88, 88.02, 88.15, 88.15, 88.66, 88.73, 88.81, 88.41, 88.55, 88.68, 88.69, 88.02, 87.35, 95.19, 95.39, 95.38, 95.37, 95.27, 95.17, 95.33, 95.32, 95.31, 95.37, 95.42, 95.34, 95.44, 95.53, 95.47, 95.41, 95.13, 94.15, 94.78, 97.64, 97.1, 96.87, 97.03, 96.76, 35.44, 23.63, 23.27, 24.71, 26.16, 96.36, 113.13, 129.9, 96.82, 63.74, 34.25, 33.42, 32.6, 30.69, 31.06, 31.43, 97.14, 97.51, 97.23, 98.54, 100.13, 100.95, 28.82, 33.81, 66.81, 99.82, 102.63, 101.9, 101.44, 102.19, 103.22, 103.67, 104.13, 
104.07, 104.73, 105.46, 103.74, 102.02, 103.32, 102.59, 29.54, 28.08, 28.76, 29.79, 30.82, 113.51, 129.34, 145.16, 143.18, 148.29, 153.67, 166.14, 161.16, 151.64, 149.27, 146.9, 151.67, 153.02, 149.28, 145.53, 149.1, 152.67, 158.78, 164.89, 164.84, 164.8, 162.11, 159.42, 156.73, 156.28, 155.83, 156.4, 161.0, 165.59, 164.44, 159.73, 155.76, 156.97, 158.92, 159.15, 159.39, 159.99, 160.44, 160.88, 163.89, 166.9, 167.71, 167.11, 167.0, 167.44, 168.38, 153.16, 137.94, 137.65, 152.09, 169.49, 171.36, 173.22, 174.01, 174.0, 174.2, 174.41, 157.74, 141.09, 149.32, 157.57, 156.4, 148.4, 140.78, 141.06, 141.73, 143.05, 143.91, 156.59, 169.29, 172.17, 175.05, 175.29, 175.27, 175.15, 175.02, 174.81, 174.59, 174.76, 174.94, 175.18, 175.41, 175.23, 174.51, 174.64, 174.77, 174.56, 173.25, 172.38, 174.17, 176.4, 177.27, 177.29, 177.33, 178.64, 179.98, 179.99, 176.0, 172.88, 173.77, 173.8, 173.97, 174.72, 175.24, 176.89, 179.07, 179.27, 178.78, 178.29, 175.61, 174.21, 172.8, 173.05, 173.41, 173.77, 174.65, 175.52, 175.58, 176.15, 176.71, 159.12, 141.54, 141.12, 155.62, 170.53, 165.54, 160.71, 158.22, 156.35, 156.82, 158.55, 160.27, 161.33, 162.39, 162.37, 159.48, 156.59, 156.77, 158.05, 159.32, 158.49, 157.66, 157.7, 157.74, 158.44, 159.14, 150.13, 143.06, 136.0, 125.7, 115.41, 111.19, 106.97, 107.1, 107.24, 107.45, 107.67, 113.34, 119.01, 144.87, 170.73, 174.31, 177.89, 174.78, 171.67, 163.26, 134.58, 105.9, 102.98, 100.77, 101.05, 101.39, 101.73, 99.79, 98.71, 97.64, 97.8, 97.89, 96.67, 95.45, 94.33, 93.38, 92.44, 48.53, 91.4, 91.35, 91.34, 91.33, 90.92, 90.51, 88.63, 87.0, 86.74, 86.48, 96.79, 96.09, 95.46, 95.39, 94.32, 93.25, 93.31, 93.37, 93.11, 92.57, 93.41, 94.25, 96.48, 92.71, 88.94, 90.07, 90.43, 78.06, 77.69, 77.32, 90.1, 89.15, 89.14, 88.85, 88.38, 87.63, 121.2, 120.66, 86.89, 86.42, 85.69, 84.86, 84.86, 85.34, 85.82, 86.07, 86.32, 85.82, 85.32, 86.23, 86.69, 87.15, 87.04, 86.87, 86.58, 86.0, 85.41, 85.41, 85.53, 85.66, 85.7, 85.72, 85.75, 85.92, 86.09, 85.77, 85.45, 
84.94, 85.55, 86.16, 86.21, 86.1, 85.77, 85.27, 84.56, 84.99, 85.38, 85.42, 85.98, 86.54, 86.5, 86.45, 86.56, 86.63, 86.35, 86.08, 85.82, 85.51, 85.21, 84.6, 84.84, 84.97, 85.1, 86.12, 86.88, 86.8, 86.46, 86.47, 87.23, 87.8, 88.0, 88.08, 88.16, 87.72, 87.63, 87.37, 86.42, 86.48, 87.24, 87.97, 88.09, 88.19, 88.32, 88.44, 87.82, 87.2, 86.03, 85.78, 91.5, 93.0, 88.2, 88.52, 88.42, 87.28, 85.73, 85.62, 85.5, 85.5, 87.06, 87.6, 88.1, 88.31, 88.53, 88.77, 89.14, 89.52, 89.46, 89.4, 90.28, 89.74, 91.28, 92.17, 92.16, 92.15, 93.08, 94.0, 94.66, 95.32, 94.13, 93.7, 93.32, 93.69, 94.58, 95.47, 97.25, 99.03, 99.63, 99.67, 99.71, 100.33, 101.58, 103.36, 103.49, 103.41, 106.31, 109.34, 109.28, 109.21, 107.76, 106.31, 105.43, 104.94, 104.44, 111.19, 117.93, 115.59, 113.24, 116.15, 119.06, 125.43, 140.72, 156.0, 161.7, 143.52, 135.33, 127.13, 127.68, 148.68, 169.68, 172.2, 174.72, 174.75, 174.66, 158.57, 142.63, 145.13, 153.29, 161.45, 163.34, 165.24, 162.25, 159.89, 159.07, 156.39, 155.21, 156.04, 159.29, 160.07, 160.85, 163.45, 162.93, 161.71, 160.06, 158.4, 144.74, 132.64, 134.57, 150.22, 165.86, 172.95, 174.12, 175.3, 175.5, 176.31, 177.71, 179.72, 168.13, 156.55, 146.24, 155.75, 176.0, 175.99, 175.98, 176.0, 176.02, 176.25, 175.13, 174.26, 173.38, 173.37, 173.46, 176.34, 174.55, 172.77, 168.45, 166.35, 166.47, 168.81, 167.43, 166.79, 167.35, 168.65, 168.51, 168.37, 168.88, 169.74, 171.19, 171.33, 169.91, 168.49, 167.11, 166.83, 167.01, 168.68, 170.34, 170.43, 172.15, 173.86, 177.62, 177.61, 175.34, 173.06, 176.47, 179.87, 179.9, 177.67, 175.67, 175.39, 175.36, 177.03, 176.0, 174.98, 174.96, 174.94, 175.76, 176.57, 169.05, 162.99, 164.97, 168.74, 172.51, 167.38, 165.08, 163.03, 163.81, 164.83, 164.81, 164.8, 165.88, 165.36, 159.61, 153.86, 153.57, 153.61, 153.65, 154.62, 155.58, 157.97, 156.35, 155.66, 154.98, 156.11, 157.24, 159.25, 159.6, 160.43, 161.26, 164.71, 168.17, 147.46, 126.92, 106.38, 105.23, 104.4, 105.37, 106.65, 109.21, 107.44, 104.65, 101.86, 102.35, 102.84, 
102.79, 102.19, 101.59, 100.98, 100.38, 98.72, 97.73, 97.32, 96.9, 95.11, 93.97, 94.12, 94.12, 93.1, 92.08, 89.29, 90.35, 90.35, 90.35, 90.35, 86.95, 86.37, 86.06, 85.74, 94.56, 93.16, 92.46, 91.76, 88.55, 85.33, 87.52, 92.18, 93.68, 95.18, 94.4, 92.17, 89.94, 89.4, 89.37, 99.44, 100.98, 102.52, 103.18, 88.96, 88.23, 87.5, 85.2, 85.19, 86.87, 121.42, 155.96, 155.97, 155.97, 86.2, 86.5, 86.8, 87.22, 87.36, 87.34, 87.03, 87.04, 87.05, 86.36, 85.68, 85.71, 85.84, 85.93, 86.01, 86.04, 86.08, 85.92, 86.05, 86.18, 86.17, 86.19, 86.23, 86.22, 86.09, 85.92, 85.66, 85.69, 85.69, 85.31, 84.91, 84.93, 84.95, 84.93, 84.91, 84.9, 84.9, 84.9, 84.9, 85.38, 85.52, 85.66, 85.66, 85.4, 85.14, 85.47, 85.8, 85.72, 85.64, 86.09, 85.84, 85.27, 85.47, 85.66, 85.59, 85.52, 85.38, 85.39, 85.28, 85.17, 85.39, 85.7, 85.98, 86.26, 86.61, 92.97, 93.15, 86.58, 86.58, 86.53, 86.47, 98.55, 99.41, 100.16, 100.9, 89.19, 90.28, 91.38, 91.39, 91.4, 91.44, 92.05, 131.05, 170.63, 170.13, 162.43, 125.64, 88.85, 88.85, 99.08, 100.38, 101.69, 100.74, 99.79, 96.33, 93.31, 93.73, 94.87, 96.01, 96.93, 97.85, 98.97, 97.85, 98.14, 99.37, 102.01, 103.8, 105.58, 108.52, 108.12, 107.72, 106.75, 106.82, 109.08, 112.37, 112.52, 112.66, 112.97, 114.12, 115.64, 117.1, 118.57, 126.13, 133.69, 149.27, 163.96, 166.62, 169.27, 164.94, 160.61, 149.35, 141.18, 143.41, 143.57, 149.26, 157.49, 159.94, 151.93, 147.47, 145.97, 145.56, 145.15, 143.85, 142.54, 142.18, 142.43, 143.12, 144.41, 144.38, 151.99, 159.59, 174.81, 174.94, 175.84, 176.87, 162.41, 152.94, 151.59, 155.24, 155.22, 155.19, 155.04]
p0 = outLiersFN(smooth(remove_nan(the_angel)))
the_angel = p0
plt.plot(the_angel)
plt.show()
print(the_angel)
How can I smooth the values in the_angel so that I get a graph like this (the red line)? I mean ignoring all the unnecessary, noisy values and keeping only the main line. You can edit my code or suggest a new filter or algorithm.
pandas has a rolling() method for DataFrames that you can use to calculate the mean over a window of values, e.g. the 70 closest ones:
import pandas as pd
import matplotlib.pyplot as plt

WINDOW_SIZE = 70
the_angel = [176.04, 173.82, 170.09, 165.3, 171.8,  # ...
             ]
df = pd.DataFrame({'the angel': the_angel})
df[f'mean of {WINDOW_SIZE}'] = df['the angel'].rolling(
    window=WINDOW_SIZE, center=True).mean()
df.plot(color=['blue', 'red'])
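If a plain moving average flattens too much detail, one alternative (a sketch, not part of the original answer) is SciPy's Savitzky-Golay filter, which fits a low-order polynomial in each window and preserves peaks better; the `window_length` and `polyorder` values below are illustrative and should be tuned:

```python
import numpy as np
from scipy.signal import savgol_filter

# Synthetic noisy signal standing in for `the_angel`.
rng = np.random.default_rng(0)
x = np.linspace(0, 4 * np.pi, 400)
noisy = np.sin(x) * 90 + 90 + rng.normal(scale=10, size=x.size)

# window_length must be odd and greater than polyorder.
smoothed = savgol_filter(noisy, window_length=71, polyorder=3)
```

The result can be plotted next to the raw signal exactly like the rolling-mean column above.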

trouble with my nested for loops to achieve results similar to a SUMIF

I'm trying to cycle through 2 lists using for loops to calculate the sum for each unique reference. I suppose I'm looking for a pythonic sumif!
# list of data ("user_ID", "contract_Number", "weight", "type")
list1 = [
('1','261','6.2','Input'),
('1','262','7.2','Input'),
('1','263','5.2','Input'),
('1','264','8.2','Input'),
('1','261','3.2','Input'),
('1','262','2.2','Input'),
('1','262','7.2','Input'),
('1','263','4.2','Input'),
('1','264','6.2','Input'),
('1','265','6.2','Input'),
('1','261','9.2','Input'),
('1','261','10.2','Input')
]
contract_list = []
# create a list of contract numbers
for data_row in list1:
    if data_row[0] == "1" and data_row[3] == "Input":
        contract_list.append(data_row[1])
#remove duplication - left with a list of unique contract numbers
contract_list = list(dict.fromkeys(contract_list))
print(contract_list)
# I'm trying this...[28.6, 16.6, 9.4, 14.4, 6.2]
tally_list = []
tally = 0
for c in contract_list:
    for l in list1:
        if data_row[0] == '1' and data_row[1] == contract_list[0]:
            tally = tally + float(data_row[2])
    tally_list.append(tally)
print(tally_list)
I'm expecting...
['261', '262', '263', '264', '265']
[28.6, 16.6, 9.4, 14.4, 6.2]
I'm getting...
['261', '262', '263', '264', '265']
[122.40000000000002, 244.7999999999999, 367.19999999999976, 489.5999999999996, 612.0]
# I'm trying this...[28.6, 16.6, 9.4, 14.4, 6.2]
tally_list = []
tally = 0
for c in contract_list:
    for l in list1: #<----------
        if data_row[0] == '1' and data_row[1] == contract_list[0]:
            tally = tally + float(data_row[2])
    tally_list.append(tally)
In the marked row, it looks like you want to iterate with the data_row variable instead of l.
Actually, try this: you additionally need to reset tally after each contract, and use c instead of contract_list[0] in the if statement.
# I'm trying this...[28.6, 16.6, 9.4, 14.4, 6.2]
tally_list = []
tally = 0
for c in contract_list:
    for data_row in list1:
        if data_row[0] == '1' and data_row[1] == c: #<----
            tally = tally + float(data_row[2])
    tally_list.append(tally)
    tally = 0 #<---
print(tally_list)
Just another approach, using a defaultdict:
from collections import defaultdict
list1 = [
('1','261','6.2','Input'),
('1','262','7.2','Input'),
('1','263','5.2','Input'),
('1','264','8.2','Input'),
('1','261','3.2','Input'),
('1','262','2.2','Input'),
('1','262','7.2','Input'),
('1','263','4.2','Input'),
('1','264','6.2','Input'),
('1','265','6.2','Input'),
('1','261','9.2','Input'),
('1','261','10.2','Input')
]
d = defaultdict(int)
for tup in list1:
    if tup[0] == '1' and tup[3] == 'Input':
        d[tup[1]] += float(tup[2])
contract_list = list(d)
print(contract_list)
tally_list = [format(v, '.1f') for v in d.values()]
print(tally_list)
Output:
['261', '262', '263', '264', '265']
['28.8', '16.6', '9.4', '14.4', '6.2']
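If pandas is an option, the same "sumif" can also be expressed with a groupby — a small sketch on a shortened version of the data (the column names are illustrative, not from the original code):

```python
import pandas as pd

list1 = [
    ('1', '261', '6.2', 'Input'),
    ('1', '262', '7.2', 'Input'),
    ('1', '261', '3.2', 'Input'),
    ('2', '261', '9.9', 'Input'),  # filtered out: different user_ID
]
df = pd.DataFrame(list1, columns=['user_ID', 'contract_Number', 'weight', 'type'])
df['weight'] = df['weight'].astype(float)

# The sumif: filter rows, then sum weight per contract number
mask = (df['user_ID'] == '1') & (df['type'] == 'Input')
totals = df[mask].groupby('contract_Number')['weight'].sum()
print(totals.round(1).to_dict())
# {'261': 9.4, '262': 7.2}
```

Rounding before printing hides the usual floating-point noise (e.g. 9.399999999999999) that also shows up in the question's output.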

Incorrect scikit-learn linear model prediction with date offset

I'm trying to predict time-series data, but by offsetting the result by date_offset-timepoints before training and prediction. The reason for doing this is to try and predict date_offset-timepoints into the future with the present data. See http://glowingpython.blogspot.co.za/2015/01/forecasting-beer-consumption-with.html for an example.
So in summary:
data = [1,2,3,4,5] should predict result = [2,3,4,5,6] if date_offset = 1
The results on the plot below show the red line being shifted by date_offset, not predicting date_offset into the future. No matter how big I make date_offset, it keeps shifting instead of predicting the last result I have, i.e. result = 5 (which is already known). In fact, the red line should not shift at all, just lose accuracy the bigger date_offset becomes. What am I doing wrong?
See example code and resulting image below:
from sklearn import linear_model
import matplotlib.pyplot as plt
import numpy as np
date_offset = 1
data = np.array([9330.0, 9470.0, 9550.0, 9620.0, 9600.0, 9585.0, 9600.0, 9600.0, 9430.0, 9460.0, 9450.0, 9650.0, 9620.0, 9650.0, 9500.0, 9400.0, 9165.0, 9100.0, 8755.0, 8850.0, 8990.0, 9150.0, 9195.0, 9175.0, 9250.0, 9200.0, 9350.0, 9280.0, 9370.0, 9470.0, 9445.0, 9440.0, 9280.0, 9325.0, 9170.0, 9270.0, 9200.0, 9450.0, 9510.0, 9371.0, 9499.0, 9499.0, 9400.0, 9500.0, 9550.0, 9670.0, 9700.0, 9760.0, 9767.4599999999991, 9652.0, 9520.0, 9600.0, 9610.0, 9700.0, 9825.0, 9900.0, 9950.0, 9801.0, 9770.0, 9545.0, 9630.0, 9710.0, 9700.0, 9700.0, 9600.0, 9615.0, 9575.0, 9500.0, 9600.0, 9480.0, 9565.0, 9510.0, 9475.0, 9600.0, 9400.0, 9400.0, 9400.0, 9300.0, 9430.0, 9410.0, 9380.0, 9320.0, 9000.0, 9100.0, 9000.0, 9200.0, 9210.0, 9251.0, 9460.0, 9400.0, 9600.0, 9621.0, 9440.0, 9490.0, 9675.0, 9850.0, 9680.0, 10100.0, 9900.0, 10100.0, 9949.0, 10040.0, 10050.0, 10200.0, 10400.0, 10350.0, 10200.0, 10175.0, 10001.0, 10110.0, 10400.0, 10401.0, 10300.0, 10548.0, 10515.0, 10475.0, 10200.0, 10481.0, 10500.0, 10540.0, 10559.0, 10300.0, 10400.0, 10202.0, 10330.0, 10450.0, 10540.0, 10540.0, 10650.0, 10450.0, 10550.0, 10501.0, 10206.0, 10250.0, 10345.0, 10225.0, 10330.0, 10506.0, 11401.0, 11245.0, 11360.0, 11549.0, 11415.0, 11450.0, 11460.0, 11600.0, 11530.0, 11450.0, 11402.0, 11299.0])
data = data[np.newaxis].T
results = np.array([9470.0, 9545.0, 9635.0, 9640.0, 9600.0, 9622.0, 9555.0, 9429.0, 9495.0, 9489.0, 9630.0, 9612.0, 9630.0, 9501.0, 9372.0, 9165.0, 9024.0, 8780.0, 8800.0, 8937.0, 9051.0, 9100.0, 9166.0, 9220.0, 9214.0, 9240.0, 9254.0, 9400.0, 9450.0, 9470.0, 9445.0, 9301.0, 9316.0, 9170.0, 9270.0, 9251.0, 9422.0, 9466.0, 9373.0, 9440.0, 9415.0, 9410.0, 9500.0, 9520.0, 9620.0, 9705.0, 9760.0, 9765.0, 9651.0, 9520.0, 9600.0, 9610.0, 9700.0, 9805.0, 9900.0, 9950.0, 9800.0, 9765.0, 9602.0, 9630.0, 9790.0, 9710.0, 9800.0, 9649.0, 9580.0, 9780.0, 9560.0, 9501.0, 9511.0, 9530.0, 9498.0, 9475.0, 9595.0, 9500.0, 9460.0, 9400.0, 9310.0, 9382.0, 9375.0, 9385.0, 9320.0, 9100.0, 8990.0, 9045.0, 9129.0, 9201.0, 9251.0, 9424.0, 9440.0, 9500.0, 9621.0, 9490.0, 9512.0, 9599.0, 9819.0, 9684.0, 10025.0, 9984.0, 10110.0, 9950.0, 10048.0, 10095.0, 10200.0, 10338.0, 10315.0, 10200.0, 10166.0, 10095.0, 10110.0, 10400.0, 10445.0, 10360.0, 10548.0, 10510.0, 10480.0, 10180.0, 10488.0, 10520.0, 10510.0, 10565.0, 10450.0, 10400.0, 10240.0, 10338.0, 10410.0, 10540.0, 10481.0, 10521.0, 10530.0, 10325.0, 10510.0, 10446.0, 10249.0, 10236.0, 10211.0, 10340.0, 10394.0, 11370.0, 11250.0, 11306.0, 11368.0, 11415.0, 11400.0, 11452.0, 11509.0, 11500.0, 11455.0, 11400.0, 11300.0, 11369.0])
# Date offset to predict next i-days results
data = data[:-date_offset]
results = results[date_offset:]
train_data = data[:-50]
train_results = results[:-50]
test_data = data[-50:]
test_results = results[-50:]
regressor = linear_model.BayesianRidge(normalize=True)
regressor.fit(train_data, train_results)
plt.figure(figsize=(8,6))
plt.plot(regressor.predict(test_data), '--', color='#EB3737', linewidth=2, label='Prediction')
plt.plot(test_results, label='True', color='green', linewidth=2)
plt.legend(loc='best')
plt.show()
First of all, the model is not really bad. For instance, when the real value is 10450, it predicts 10350, which is really close. And, obviously, the farther in time the predicted point is, the less accurate the prediction, as the variance is growing and sometimes the bias grows as well. You cannot expect the opposite.
Secondly, it is a linear model, so it cannot be absolutely exact when the predicted variable is not linear by nature.
Thirdly, one has to choose the predicted variable with care. For instance, in this case you might try to predict not the value at time T, but the change in value at time T (i.e. C[T] = V[T] - V[T-1]) or the moving average of the last K values. Here you might (or, on the contrary, might not) find out that you are trying to model a so-called "random walk", which is hard to predict exactly by its very random nature.
And lastly, you might consider other models, like ARIMA, which are better suited for predicting time series.
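The differencing idea from the third point can be sketched in a few lines — a minimal example, assuming NumPy; the values are the first few points of the question's data:

```python
import numpy as np

# Predict the change C[T] = V[T] - V[T-1] instead of the raw value V[T].
# np.diff computes those first differences; the original series can be
# rebuilt from the first value plus the cumulative sum of the changes.
values = np.array([9330.0, 9470.0, 9550.0, 9620.0, 9600.0])
changes = np.diff(values)  # [140., 80., 70., -20.]
restored = np.concatenate(([values[0]], values[0] + np.cumsum(changes)))
assert np.allclose(restored, values)
```

A model trained on `changes` then forecasts increments, which are summed back onto the last known value to produce the actual prediction.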
Adding back the organize_data step:
import matplotlib.pyplot as plt
import numpy as np
import pandas as pd
from sklearn import linear_model
def organize_data(to_forecast, window, horizon):
    """
    Input:
        to_forecast, univariate time series organized as numpy array
        window, number of items to use in the forecast window
        horizon, horizon of the forecast
    Output:
        X, a matrix where each row contains a forecast window
        y, the target values for each row of X
    """
    shape = to_forecast.shape[:-1] + \
            (to_forecast.shape[-1] - window + 1, window)
    strides = to_forecast.strides + (to_forecast.strides[-1],)
    X = np.lib.stride_tricks.as_strided(to_forecast,
                                        shape=shape,
                                        strides=strides)
    y = np.array([X[i+horizon][-1] for i in range(len(X)-horizon)])
    return X[:-horizon], y
data = np.array([9330.0, 9470.0, 9550.0, 9620.0, 9600.0, 9585.0, 9600.0, 9600.0, 9430.0, 9460.0, 9450.0, 9650.0, 9620.0, 9650.0, 9500.0, 9400.0, 9165.0, 9100.0, 8755.0, 8850.0, 8990.0, 9150.0, 9195.0, 9175.0, 9250.0, 9200.0, 9350.0, 9280.0, 9370.0, 9470.0, 9445.0, 9440.0, 9280.0, 9325.0, 9170.0, 9270.0, 9200.0, 9450.0, 9510.0, 9371.0, 9499.0, 9499.0, 9400.0, 9500.0, 9550.0, 9670.0, 9700.0, 9760.0, 9767.4599999999991, 9652.0, 9520.0, 9600.0, 9610.0, 9700.0, 9825.0, 9900.0, 9950.0, 9801.0, 9770.0, 9545.0, 9630.0, 9710.0, 9700.0, 9700.0, 9600.0, 9615.0, 9575.0, 9500.0, 9600.0, 9480.0, 9565.0, 9510.0, 9475.0, 9600.0, 9400.0, 9400.0, 9400.0, 9300.0, 9430.0, 9410.0, 9380.0, 9320.0, 9000.0, 9100.0, 9000.0, 9200.0, 9210.0, 9251.0, 9460.0, 9400.0, 9600.0, 9621.0, 9440.0, 9490.0, 9675.0, 9850.0, 9680.0, 10100.0, 9900.0, 10100.0, 9949.0, 10040.0, 10050.0, 10200.0, 10400.0, 10350.0, 10200.0, 10175.0, 10001.0, 10110.0, 10400.0, 10401.0, 10300.0, 10548.0, 10515.0, 10475.0, 10200.0, 10481.0, 10500.0, 10540.0, 10559.0, 10300.0, 10400.0, 10202.0, 10330.0, 10450.0, 10540.0, 10540.0, 10650.0, 10450.0, 10550.0, 10501.0, 10206.0, 10250.0, 10345.0, 10225.0, 10330.0, 10506.0, 11401.0, 11245.0, 11360.0, 11549.0, 11415.0, 11450.0, 11460.0, 11600.0, 11530.0, 11450.0, 11402.0, 11299.0])
train_window = 50
k = 5 # number of previous observations to use
h = 2 # forecast horizon
X,y = organize_data(data, k, h)
train_data = X[:train_window]
train_results = y[:train_window]
test_data = X[train_window:]
test_results = y[train_window:]
regressor = linear_model.BayesianRidge(normalize=True)
regressor.fit(train_data, train_results)
plt.figure(figsize=(8,6))
plt.plot(regressor.predict(X), '--', color='#EB3737', linewidth=2, label='Prediction')
plt.plot(y, label='True', color='green', linewidth=2)
plt.legend(loc='best')
plt.show()
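For intuition, here is what the windowing step in organize_data produces on a toy series — a sketch that re-derives the same windows with NumPy's sliding_window_view (available in NumPy 1.20+) instead of raw strides:

```python
import numpy as np

series = np.arange(10, dtype=float)  # 0, 1, ..., 9
window, horizon = 3, 2

# Every length-3 window of the series: [0,1,2], [1,2,3], ..., [7,8,9]
W = np.lib.stride_tricks.sliding_window_view(series, window)
# The target for window i is the last value of the window `horizon` steps
# later, i.e. the observation `horizon` steps past the end of window i.
y = np.array([W[i + horizon][-1] for i in range(len(W) - horizon)])
X = W[:-horizon]
# X[0] == [0, 1, 2] and its target y[0] == 4.0
```

Because each training row now contains the `window` most recent observations and the target lies `horizon` steps ahead, the model genuinely forecasts into the future instead of merely shifting the series.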
