I have a function that creates all combinations of list items, representing this as a list of lists:
def makeCombos(arr):
    yield (sum([map(list, combinations(arr, i)) for i in range(len(arr) + 1)], []))
Calling makeCombos([1, 2, 3, 4, 5]) gives me a generator object, but calling .next() does not give me one combo at a time; it gives me the entire list of combos.
How can I turn this into a generator function that I can call?
sum(iterable, []) doesn't create a list of lists; it actually flattens one.
With yield (sum(...)) you are yielding a single item: the flattened list of all combinations.
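For example (a quick interpreter check, not from the original post), the flattening looks like this:
>>> sum([[1], [2, 3], [4]], [])
[1, 2, 3, 4]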
For Python 2.x, sum([map(list, combinations(arr, i)) ...], []) will work, but in Python 3.x map no longer returns a list; it returns a map object. So, if anyone is on Python 3.x, simply wrap it as list(map(...)) for this to run.
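As a rough sketch (my adaptation, not the original poster's code), a Python 3 friendly version of that eager, flattened one-liner could look like this:
from itertools import combinations

def makeCombosList(arr):  # hypothetical name for the eager, flattened variant
    # list(map(...)) is needed on Python 3 because map returns a lazy map object
    return sum([list(map(list, combinations(arr, i))) for i in range(len(arr) + 1)], [])

print(makeCombosList([1, 2, 3]))
# [[], [1], [2], [3], [1, 2], [1, 3], [2, 3], [1, 2, 3]]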
I think what you actually want is something like this:
from itertools import combinations
def makeCombos(arr):
    for i in range(len(arr) + 1):
        for combo in map(list, combinations(arr, i)):
            yield combo
# or call next() on the generator
combos = makeCombos([1, 2, 3, 4, 5])
for combo in combos:
    print combo
An alternative from the comment(s) for a one-liner:
Instead of yielding, we can return a generator expression and iterate over it just as we would with the yield version, e.g.:
from itertools import combinations
def makeCombos(arr):
    return (combo for i in range(len(arr) + 1) for combo in map(list, combinations(arr, i)))
combos = makeCombos([1, 2, 3, 4, 5])
....
As for this being "Pythonic", I wouldn't really say so. I actually prefer the nested for loop; it is far more readable.
Although we can still try to clean it up / compact it a bit more with a few "tricks":
from itertools import combinations as cs  # or some other name
def makeCombos(arr):
    return (c for i in range(len(arr) + 1) for c in map(list, cs(arr, i)))
But now you've lost all readability, and this looks like something you'd see in Perl (the horror!).
Output:
[]
[1]
[2]
[3]
[4]
[5]
[1, 2]
[1, 3]
[1, 4]
[1, 5]
[2, 3]
[2, 4]
[2, 5]
[3, 4]
[3, 5]
[4, 5]
[1, 2, 3]
[1, 2, 4]
[1, 2, 5]
[1, 3, 4]
[1, 3, 5]
[1, 4, 5]
[2, 3, 4]
[2, 3, 5]
[2, 4, 5]
[3, 4, 5]
[1, 2, 3, 4]
[1, 2, 3, 5]
[1, 2, 4, 5]
[1, 3, 4, 5]
[2, 3, 4, 5]
[1, 2, 3, 4, 5]
itertools already has a function for joining iterables together: chain. What you want is something like the following:
from itertools import chain, combinations
def makeCombos(arr):
    return chain.from_iterable(combinations(arr, i) for i in range(len(arr) + 1))
Simple, short, and fairly Pythonic in my opinion.
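A quick usage sketch (my addition, not part of the original answer). Note that chain yields the tuples produced by combinations, so wrap each combo in list() if you specifically need lists:
combos = makeCombos([1, 2, 3, 4, 5])
print(next(combos))   # ()
print(next(combos))   # (1,)
print([list(c) for c in combos][:3])
# [[2], [3], [4]]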
An example:
list = [[2, 1, 2, 3, 4],
[0, 4, 5],
[1, 8, 9]]
So the first index inside a nested list decides which following numbers will be put into an unnested list.
[2, 1, 2, 3, 4] -> 2: so 1 and 2 get picked up
[0, 4, 5] -> 0: no number gets picked up
[1, 8, 9] -> 1: number 8 gets picked up
Output would be:
[1, 2, 8]
This is what I have so far:
def nested_list(numbers):
    if isinstance(numbers[0], list):
        if numbers[0][0] > 0:
            nested_list(numbers[0][1:numbers[0][0] + 1])
        else:
            numbers = list(numbers[0])
    return numbers + nested_list(numbers[1:])
I'm trying to build the list through recursion but something is wrong. What am I missing, or could this be done without recursion?
You can try a list comprehension with starred unpacking here.
[val for idx, *rem in lst for val in rem[:idx]]
# [1, 2, 8]
NB: This solution assumes every sub-list has size 1 or greater. We can filter out empty sub-lists using filter(None, lst).
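For example (my own illustration, assuming the input is named lst and may contain empty sub-lists):
lst = [[2, 1, 2, 3, 4], [0, 4, 5], [], [1, 8, 9]]  # note the empty sub-list
print([val for idx, *rem in filter(None, lst) for val in rem[:idx]])
# [1, 2, 8]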
list1 = [[2, 1, 2, 3, 4],
         [0, 4, 5],
         [1, 8, 9]]
list2 = []
for nested_list in list1:
    for i in range(nested_list[0]):
        list2.append(nested_list[i+1])
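A quick check of the result (my addition):
print(list2)
# [1, 2, 8]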
You can try List-comprehension:
>>> [sub[i] for sub in lst for i in range(1, sub[0]+1) ]
[1, 2, 8]
PS: The solution expects each sublist to be non-empty, otherwise it will throw an IndexError exception due to sub[0].
Another option, combining a list comprehension with sum (where arr is the list of lists):
sum([x[1:x[0] + 1] for x in arr], [])
# [1, 2, 8]
Using the builtin function map to apply the picking function, and itertools.chain to flatten the resulting list of lists:
import itertools

def pick(l):
    return l[1:1+l[0]]

ll = [[2, 1, 2, 3, 4], [0, 4, 5], [1, 8, 9]]
print( list(map(pick, ll)) )
# [[1, 2], [], [8]]
print( list(itertools.chain.from_iterable(map(pick, ll))) )
# [1, 2, 8]
Or alternatively, with a list comprehension:
ll = [[2, 1, 2, 3, 4], [0, 4, 5], [1, 8, 9]]
print( [x for l in ll for x in l[1:1+l[0]]] )
# [1, 2, 8]
Two important notes:
I've renamed your list of lists ll rather than list. This is because list is already the name of the builtin class list in Python. Shadowing the name of a builtin is dangerous and can have unexpected consequences. I strongly advise you never to reuse the name of a builtin when naming your own variables.
For both solutions above, the error handling behaves the same: an IndexError will be raised if one of the sublists is empty (because we need to access the first element to know how many elements to pick, so an error is raised if there is no first element). However, no exception will be raised if there are not enough elements in one of the sublists. For instance, if one of the sublists is [12, 3, 4], then both solutions above will silently pick the two elements 3 and 4, even though they were asked to pick 12 elements and not just 2. If you want an exception to be raised in this situation, you can modify the function pick from the first solution:
def pick(l):
    if len(l) == 0 or len(l) <= l[0]:
        raise ValueError('in function pick: too few elements in sublist {}'.format(l))
    return l[1:1+l[0]]
ll = [[2, 1, 2, 3, 4], [0, 4, 5], [1, 8, 9], [12, 3, 4]]
print( [x for l in ll for x in l[1:1+l[0]]] )
# [1, 2, 8, 3, 4]
print( [x for l in ll for x in pick(l)] )
# ValueError: in function pick: too few elements in sublist [12, 3, 4]
So I have a function which takes a variable number of lists as an argument, then combines those lists into one single list:
def comb_lists(*lists):
    sublist = []
    for l in lists:
        sublist.extend(l)
    print(sublist)
>>> comb_lists([1, 2], [3, 4], [5, 6])
[1, 2, 3, 4, 5, 6]
And it works. But I was just wondering if there was a simpler solution? I tried a list comprehension using list unpacking, but that returned a SyntaxError:
def comb_lists(*lists):
    sublist = [*l for l in lists]
>>> comb_lists([1, 2], [3, 4], [5, 6])
SyntaxError: iterable unpacking cannot be used in comprehension
Is there any neater or quicker way to do this?
EDIT: itertools looks really useful for this sort of thing. I'd be interested to know if there's any way of doing it that doesn't rely on imports though.
Here is the simplest solution:
result = sum(lists, [])
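A sketch of how this fits the original function (my own addition; note that sum() on lists copies the accumulated result at every step, so it can get slow for many or large lists):
def comb_lists(*lists):
    # sum() with a [] start value concatenates all the lists
    return sum(lists, [])

print(comb_lists([1, 2], [3, 4], [5, 6]))
# [1, 2, 3, 4, 5, 6]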
There's a built-in function chain.from_iterable() in the itertools module to do this:
>>> from itertools import chain
>>> my_list = [[1, 2], [3, 4], [5, 6]]
>>> list(chain.from_iterable(my_list))
[1, 2, 3, 4, 5, 6]
If you do not want to import any module, you can write nested list comprehension to achieve this as:
>>> my_list = [[1, 2], [3, 4], [5, 6]]
>>> [e for l in my_list for e in l]
[1, 2, 3, 4, 5, 6]
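To tie this back to the original function without any imports (my own sketch, addressing the edit in the question):
def comb_lists(*lists):
    # take each element e from each list l, in order
    return [e for l in lists for e in l]

print(comb_lists([1, 2], [3, 4], [5, 6]))
# [1, 2, 3, 4, 5, 6]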
I am attempting to insert a list of points into a deque, but I'm having trouble keeping it a continuous list:
from collections import deque
pts = deque()
pts = [1, 5]
new_pts = [2, 3, 4]
pts.insert(1,new_pts)
Output:
[1, [2, 3, 4], 5]
Desired output:
[1, 2, 3, 4, 5]
This works:
[pts.insert(1,pt) for pt in reversed(new_pts)]
But I'm afraid I'm overcomplicating things.
You can use simple slice assignment (this works on a list, as in your example; deques don't support slice assignment):
pts[1:1] = new_pts
Code:
pts = [1, 5]
new_pts = [2, 3, 4]
pts[1:1] = new_pts
print(pts)
# [1, 2, 3, 4, 5]
You need to use extendleft() to add more than one value to the beginning of the deque; note that it prepends the items one at a time, so they end up in reversed order.
pts.extendleft(new_pts)
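A small illustration of how extendleft() behaves (my addition); reverse the new items first if their order should be preserved:
from collections import deque

pts = deque([1, 5])
new_pts = [2, 3, 4]
pts.extendleft(reversed(new_pts))  # reversed() so the items keep their original order
print(pts)
# deque([2, 3, 4, 1, 5])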
I have the following list in Python:
[1, 2, 3, 4]
Is there a Python itertools function that produces the following:
[1]
[1, 2]
[1, 2, 3]
[1, 2, 3, 4]
This is trivial without itertools:
def fall(it):
    ls = []
    for x in it:
        ls.append(x)
        yield ls

for x in fall(xrange(20)):
    print x
Note that this works with any iterable, not just a list.
If you still want itertools, something like this should work (py3):
import itertools

for x in itertools.accumulate(map(lambda x: [x], it)):
    print(x)
Again, it's lazy and works with any iterable.
There isn't anything in itertools that I can think of, but this should work:
def incremental(L):
    for i in range(1, len(L)+1):
        yield L[:i]
Output:
In [53]: print(*incremental([1, 2, 3, 4]), sep='\n')
[1]
[1, 2]
[1, 2, 3]
[1, 2, 3, 4]
This can be written as a one-liner using a list comprehension (using my_list rather than shadowing the builtin name list):
>>> my_list = [1, 2, 3, 4]
>>> [my_list[:x+1] for x in range(len(my_list))]
[[1], [1, 2], [1, 2, 3], [1, 2, 3, 4]]
If you must use itertools, you may use itertools.islice as:
from itertools import islice
my_list = [1, 2, 3, 4]
for i in range(1, len(my_list)+1):
    print list(islice(my_list, i))
However there is absolutely no need to use itertools here. You may achieve this via simple list slicing as:
for i in range(len(my_list)):
    print my_list[:i+1]
Both of the above solutions will print the result as:
[1]
[1, 2]
[1, 2, 3]
[1, 2, 3, 4]
A one-line solution, using partial and islice:
from itertools import islice
from functools import partial
my_list = [1, 2, 3, 4]
[list(l) for l in map(partial(islice, my_list), range(1,len(my_list)+1))]
you get,
[[1], [1, 2], [1, 2, 3], [1, 2, 3, 4]]
in other words,
from itertools import islice
from functools import partial
my_list = [1, 2, 3, 4]
p = partial(islice, my_list)
for i in range(1, 5):
    print(list(p(i)))
you get,
[1]
[1, 2]
[1, 2, 3]
[1, 2, 3, 4]
You don't need itertools, just use a map:
>>> l = [1, 2, 3, 4]
>>> for sub_list in map(lambda index: l[:index + 1], range(len(l))):
...     print sub_list
I want to generate or return an append-accumulated list from a given list (or iterator). For a list like [1, 2, 3, 4], I would like to get, [1], [1, 2], [1, 2, 3] and [1, 2, 3, 4]. Like so:
>>> def my_accumulate(iterable):
...     grow = []
...     for each in iterable:
...         grow.append(each)
...         yield grow
...
>>> for x in my_accumulate(some_list):
...     print x  # or something more useful
...
[1]
[1, 2]
[1, 2, 3]
[1, 2, 3, 4]
This works but is there an operation I could use with itertools.accumulate to facilitate this? (I'm on Python2 but the pure-python implementation/equivalent has been provided in the docs.)
Another problem I have with my_accumulate is that it doesn't work well with list(); it outputs the entire some_list for each element in the list:
>>> my_accumulate(some_list)
<generator object my_accumulate at 0x0000000002EC3A68>
>>> list(my_accumulate(some_list))
[[1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]
Option 1:
I wrote my own appending accumulator function to use with itertools.accumulate, but considering the lines of code and the final usefulness, it seems like a waste of effort, with my_accumulate being more useful (though it may fail on empty iterables and consumes more memory, since grow keeps growing):
>>> def app_acc(first, second):
...     if isinstance(first, list):
...         first.append(second)
...     else:
...         first = [first, second]
...     return first
...
>>> for x in accumulate(some_list, app_acc):
...     print x
...
1
[1, 2]
[1, 2, 3]
[1, 2, 3, 4]
>>> list(accumulate(some_list, app_acc)) # same problem again with list
[1, [1, 2, 3, 4], [1, 2, 3, 4], [1, 2, 3, 4]]
(and the first returned elem is not a list, just a single item)
Option 2: I figured it would be easier to just do incremental slicing, albeit with the ugly iterate-over-list-length method:
>>> for i in xrange(len(some_list)): # the ugly iterate over list length method
...     print some_list[:i+1]
...
[1]
[1, 2]
[1, 2, 3]
[1, 2, 3, 4]
The easiest way to use accumulate is to make each item in the iterable a single-item list; then the default function (addition) works as expected:
from itertools import accumulate
acc = accumulate([el] for el in range(1, 5))
res = list(acc)
# [[1], [1, 2], [1, 2, 3], [1, 2, 3, 4]]
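As a side note on the list() problem above (my own explanation, not part of the original answer): my_accumulate yields the same grow list object every time, so by the time list() has consumed the generator, every reference points at the fully grown list. Yielding a copy fixes it:
def my_accumulate(iterable):
    grow = []
    for each in iterable:
        grow.append(each)
        yield list(grow)  # yield a copy so already-yielded results are not mutated later

print(list(my_accumulate([1, 2, 3, 4])))
# [[1], [1, 2], [1, 2, 3], [1, 2, 3, 4]]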