Python Iterators and Generators
What is an Iterator in Python?
In Python, an iterator is an object that allows sequential access to elements in a collection without using indexing. An object is an iterator if it implements the following two methods:
- __iter__() → returns the iterator object itself
- __next__() → returns the next item or raises StopIteration when items are exhausted
Example: Custom Iterator in Python
class CountUpTo:
    def __init__(self, max):
        self.max = max        # highest value to count up to
        self.current = 1      # next value to hand out

    def __iter__(self):
        return self           # the object is its own iterator

    def __next__(self):
        if self.current <= self.max:
            num = self.current
            self.current += 1
            return num
        else:
            raise StopIteration
counter = CountUpTo(3)
for num in counter:
    print(num)
Output:
1
2
3
Built-in Iterators
Many built-in types like lists, tuples, sets, strings, etc., are iterable and can be converted to iterators using the iter() function.
my_list = [10, 20, 30]
it = iter(my_list)
print(next(it)) # 10
print(next(it)) # 20
print(next(it)) # 30
# print(next(it)) # Raises StopIteration
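A for loop does essentially this under the hood: it obtains an iterator with iter() and keeps calling next() until StopIteration is raised. A rough equivalent sketch:

my_list = [10, 20, 30]
it = iter(my_list)
while True:
    try:
        item = next(it)          # fetch the next element
    except StopIteration:        # raised when the iterator is exhausted
        break
    print(item)                  # the body of the for loop goes here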
What is a Generator in Python?
A generator is a simpler way to build an iterator: instead of writing a class with __iter__() and __next__(), you write an ordinary function that uses the yield keyword. A generator pauses at each yield, keeps its local state between calls, and does not store the entire sequence in memory, which makes it memory-efficient.
Example: Generator Function in Python
def count_up_to(max):
    current = 1
    while current <= max:
        yield current
        current += 1
gen = count_up_to(3)
for num in gen:
    print(num)
Output:
1
2
3
The generator behaves like an iterator without you having to define the __iter__() and __next__() methods yourself.
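In fact, the object returned by a generator function is itself an iterator: Python supplies __iter__() and __next__() for it, so you can also drive it manually with next(). A quick check, reusing count_up_to() from above:

gen = count_up_to(3)
print(iter(gen) is gen)  # True: the generator is its own iterator
print(next(gen))         # 1
print(next(gen))         # 2
print(next(gen))         # 3
# next(gen) would now raise StopIteration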
Generator vs Iterator in Python
| Feature | Iterator | Generator |
| --- | --- | --- |
| Syntax | Class-based | Function-based with yield |
| Memory Usage | Can be large | Very memory-efficient |
| Boilerplate Code | More (define __iter__ and __next__) | Less (just use yield) |
| Use Case | Complex custom iteration logic | Lightweight, one-time-use iterations |
Generator Expression (Like List Comprehension)
Just like a list comprehension, a generator can be created in a single expression by using parentheses () instead of square brackets []:
gen_exp = (x*x for x in range(4))
for num in gen_exp:
    print(num)
Output:
0
1
4
9
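A generator expression is also handy as the argument to a function that consumes an iterable, such as sum(); when it is the only argument, the surrounding parentheses can even be dropped:

total = sum(x*x for x in range(4))  # the squares are produced lazily, one at a time
print(total)  # 14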
Advantages of Generators
- Lazy evaluation (values are computed on demand; see the sketch after this list)
- Low memory footprint
- Ideal for processing large data streams
- Easily pause and resume execution
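To see lazy evaluation in action, here is a minimal sketch using itertools.islice: the generator below is infinite, yet only the five requested values are ever computed or kept in memory.

import itertools

def naturals():
    """Yield 1, 2, 3, ... forever; each value is produced only when asked for."""
    n = 1
    while True:
        yield n
        n += 1

# Pull just the first five values from the infinite stream
for value in itertools.islice(naturals(), 5):
    print(value)  # 1, 2, 3, 4, 5, one per line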
Real-World Example: Reading Large File Using Generator
def read_large_file(file_path):
    with open(file_path, "r") as file:
        for line in file:
            yield line.strip()   # yield one cleaned-up line at a time

# for line in read_large_file("bigdata.txt"):
#     process(line)
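As a usage sketch (the file name bigdata.txt and the "ERROR" filter below are placeholders), the generator lets you scan an arbitrarily large file while holding only one line in memory at a time:

for line_number, line in enumerate(read_large_file("bigdata.txt"), start=1):
    if "ERROR" in line:              # placeholder condition
        print(line_number, line)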