BishopPhillips

Python Techniques - Generators

Creating & Using Generators.

Author: Jonathan Bishop

Generators in Python are a type of iterable that allow us to produce values on-the-fly, one at a time, instead of computing them all at once and storing them in memory. They are an effective way to write memory-efficient and concise code, especially for processing large or even infinite data streams. Here's our comprehensive guide to using and forming them:

 

1. What Are Generators?

Generators are most commonly used as a type of iterable, similar to lists, but they don't store all the values of the sequence in memory. Instead, they generate values as you iterate over them. Although the most common use of a generator is as an iterator, generators can also play roles that are not essentially iteration: context managers, message receivers, dataflow pipes, cooperative multitaskers, and so on. In this article we will start with iterating generators and then look at these other kinds.

Almost everyone programming in Python 3 makes regular use of at least one lazy sequence maker: the range() routine, which produces a sequence of numbers most frequently used in for-loops! Since Python 3, range() has been lazy (strictly it returns a lazy sequence object rather than a generator, but it evaluates on demand in the same way; in Python 2, range() built a full list and xrange() was its lazy cousin).
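A quick illustration of this laziness: a range over an enormous span costs almost nothing to create, because each value is computed only when it is requested.

```python
# range() produces values on demand, so even a huge range is cheap to create.
big = range(10**12)

# Indexing, membership and length are computed, not looked up in stored data.
print(big[999])       # 999
print(10**11 in big)  # True
print(len(big))       # 1000000000000
```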

As an iterator, a generator is a lazy, on-demand evaluator that generates the values in a sequence as they are requested rather than creating all the values at once when called. It creates one value, returns control to the calling routine, and waits to be summoned again to create the next value in the sequence. When used as the iterable in a for-loop, the for-loop implicitly calls the generator's next() with each iteration of the loop until the generator runs out of values and raises a StopIteration exception, at which point the loop ends.

Generators are defined in one of two ways:

    1. Generator Functions: Created using the yield keyword within a function.

    2. Generator Expressions: Created using an expression similar to list comprehensions but with parentheses instead of square brackets.

Every list or set comprehension can be converted into a generator expression by simply replacing the [] or {} with (). (A dictionary comprehension needs one small adjustment: the key: value pair becomes a (key, value) tuple in the generator expression.)
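For example, swapping the brackets is all it takes to turn an eager list comprehension into a lazy generator expression:

```python
# Eager: the full list is built immediately.
squares_list = [x**2 for x in range(5)]
print(squares_list)       # [0, 1, 4, 9, 16]

# Lazy: swapping [] for () defers all the work until iteration.
squares_gen = (x**2 for x in range(5))
print(squares_gen)        # <generator object ...> - nothing computed yet
print(list(squares_gen))  # [0, 1, 4, 9, 16]
```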

1.1 Generator Function Syntax

Generator functions follow this general syntax:

 
def generator_name(parameters):
    # Optional setup code
    while condition:  # A loop or logic for generating values
        yield expression  # Generate a value
        # Optional code to update or manage state
Key Characteristics:

  • yield:

    • Used to produce values one at a time, pausing the function.

    • Allows the generator to resume from where it left off when next() is called.

  • State Retention:

    • Keeps track of local variables between calls to next().

Example:

# A generator function counting from 1 up to n
def count_up_to(n):
    current = 1
    while current <= n:
        yield current
        current += 1

# Create the generator
gen = count_up_to(5)

# Use next() to retrieve values one at a time
print(next(gen))  # Output: 1 (First value yielded)
print(next(gen))  # Output: 2 (Second value yielded)
print(next(gen))  # Output: 3 (Third value yielded)
 
Explanation:
  1. Create the Generator:

    • The generator gen is created from the count_up_to(5) function (not executed! - see section 2 below for an explanation)

    • It will yield numbers starting from 1 up to 5.

  2. Call next():

    • Each time you call next(gen), the generator resumes execution from where it left off.

    • It continues the while loop and yields the next number, incrementing current until the loop ends.

  3. What Happens After the Last Value:

    • After yielding the last value (5), calling next(gen) again raises a StopIteration exception, signaling that the generator is exhausted.
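A for-loop catches StopIteration for you; when calling next() by hand you can catch it yourself, or pass next() a default value that is returned instead of raising. A minimal sketch:

```python
def count_up_to(n):
    current = 1
    while current <= n:
        yield current
        current += 1

gen = count_up_to(2)
print(next(gen))          # 1
print(next(gen))          # 2
print(next(gen, "done"))  # "done" - the default suppresses StopIteration

try:
    next(gen)             # No default, so StopIteration is raised
except StopIteration:
    print("Generator exhausted")
```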

1.2 Generator Expression Syntax

Generator expressions use parentheses to define generators in a concise manner:

(expression for variable in iterable if condition)
Key Characteristics:
  • Expression:

    • The computation to perform on each variable (e.g., x**2).

  • for variable in iterable:

    • Iterates through the values in the iterable (e.g., range(10)).

  • Optional if condition:

    • Filters the values in the iterable (e.g., if x % 2 == 0).

 
Example:


# Generator expression for squares
squares = (x**2 for x in range(10))

# Use next() to retrieve values one at a time
print(next(squares))  # Output: 0 (0 squared)
print(next(squares))  # Output: 1 (1 squared)
print(next(squares))  # Output: 4 (2 squared)
print(next(squares))  # Output: 9 (3 squared)
print(next(squares))  # Output: 16 (4 squared)

Explanation:
  1. Generator Expression:

    • squares = (x**2 for x in range(10)) creates a generator that lazily computes the square of numbers from 0 to 9.

  2. Calling next():

    • Each call to next(squares) retrieves the next value from the generator and evaluates the next square.

    • The generator maintains its internal state, so the sequence proceeds from where it left off.

  3. Iteration Stops:

    • After all values from the generator have been retrieved, calling next(squares) again will raise a StopIteration exception, as the generator is exhausted.

 

1.3 General Syntax and Flow

Generator Type         Syntax                                     Flow
Generator Function     def gen(): yield value                     Produces one value per yield.
Generator Expression   (expression for item in iterable if ...)   Iterates lazily, one item at a time.

 

2. How Generators Work

Overview

Generators work by maintaining their state between calls and resuming execution where they left off. This is possible because of the yield keyword, which pauses the function and temporarily returns a value.

Example: Generator Function

def simple_generator():
    yield 1
    yield 2
    yield 3

gen = simple_generator()
print(next(gen))  # Output: 1
print(next(gen))  # Output: 2
print(next(gen))  # Output: 3

Explanation:
  • The assignment to "gen" does not execute the generator (even though it looks like a conventional function result assignment), but instead assigns a generator object to gen, which is subsequently executed when "next(gen)" is called.  We will delve into this behaviour in more detail below.

  • yield pauses the function, saving the current state and local variables.

  • When next() is called, the function resumes execution from the point after yield.

 

A Detailed View Of How Generators Work

Let's consider another example in detail:


def infinite_sequence():
    i = 0
    while True:
        yield i
        i += 1

gen = infinite_sequence()
print(next(gen))  # Output: 0
print(next(gen))  # Output: 1

 
What Happens When You Assign gen = infinite_sequence()?

When the generator function infinite_sequence() is called, it doesn't actually execute the code inside the function. Instead, it creates a generator object and prepares it to start execution at the point of the first yield.

Key points:

  • No Execution Yet:

    • When you assign gen = infinite_sequence(), Python creates a generator object but does not execute the function body.

    • The function does not start running until you explicitly begin iterating over the generator using next() or a loop.

 
What Happens When You Call next(gen)?
  1. First next(gen):

    • The function starts executing from the beginning (i = 0).

    • It proceeds to the first yield statement: yield i.

    • At this point, the generator pauses and i (which is 0) is returned to the caller.

    Why the First Yield?

    • yield does two things:

      • It produces a value (i) for the caller.

      • It pauses the function, saving its current state (including the value of i and its position in the code).

  2. Subsequent next(gen):

    • The function resumes right after the yield statement.

    • i += 1 is executed, incrementing i to 1.

    • The function then hits the next yield statement, producing 1 and pausing again.

 
Key Insight: Why Doesn't the Generator Execute on Assignment?

Generators are lazy. The code inside the generator function is only executed on demand, meaning:

  • When the generator object gen is created, Python simply prepares it to run but doesn’t execute anything yet.

  • The first next(gen) triggers the generator to start executing its code, stopping at the first yield.

 
Step-by-Step Breakdown of our Example

def infinite_sequence():
    i = 0
    while True:
        yield i  # Pauses and returns current value of i
        i += 1   # Resumes from here on the next call

gen = infinite_sequence()  # Creates the generator object, but no execution yet

print(next(gen))  # Executes the function until the first yield
# Execution: i = 0, yield 0 → Output: 0

print(next(gen))  # Resumes after the first yield
# Execution: i += 1 → i = 1, yield 1 → Output: 1
 
Visualizing the State of the Generator

Here’s what happens during each call to next():

State                   Action                                     Value Returned
Initial (gen created)   Generator is prepared, no execution yet.   -
First next(gen)         Executes i = 0, stops at yield i.          0
Second next(gen)        Resumes at i += 1, stops at yield i.       1
Third next(gen)         Resumes at i += 1, stops at yield i.       2
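You can observe these states directly with the standard library's inspect.getgeneratorstate(), which reports whether a generator has not yet started, is suspended at a yield, or is finished:

```python
import inspect

def infinite_sequence():
    i = 0
    while True:
        yield i
        i += 1

gen = infinite_sequence()
print(inspect.getgeneratorstate(gen))  # GEN_CREATED - no execution yet

next(gen)
print(inspect.getgeneratorstate(gen))  # GEN_SUSPENDED - paused at yield

gen.close()
print(inspect.getgeneratorstate(gen))  # GEN_CLOSED - finished
```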
 
Why This Makes Generators Useful

This lazy behavior is one of the key advantages of generators:

  • They don’t compute values upfront; instead, they compute values as they are requested.

  • This makes generators memory-efficient and ideal for working with large datasets or infinite sequences.

 
 

3. Generator Expressions

A generator expression is a concise way to create a generator without using the def keyword. When you look at the example below, you will note that structurally it is indistinguishable from a comprehension (list, set, dictionary, etc.) EXCEPT that it has round brackets (like a tuple). I'll come back to this in a second, but for now note that a list comprehension behaves as if a generator expression were encased in square brackets (the list constructor), which iterates the generator to completion during construction. (Conceptually, at least: CPython compiles comprehensions directly, but the semantics are the same.)

Example:


gen = (x**2 for x in range(5))
print(next(gen))  # Output: 0
print(next(gen))  # Output: 1

 

OK, so the generator expression works similarly to the generator function, except that it doesn't have a yield statement: it uses a conventional iterator and otherwise follows the comprehension syntax. (See our comprehensive article on comprehensions, here!) Nevertheless, it follows the same basic idea: assign the generator object to a variable, then generate one iteration of the iterator expression at a time, yielding each result to the calling next() function.

You don't always have to assign a generator expression to a variable to use it. Here is an example of a recursive generator using a generator expression as its iterator. It scans directories looking for any instance of a target directory and yields the path of each one found.


import pathlib as pl

tpath="."
tdir="Test2"

def finddir(stpath, target_dir):
    for file in (file for file in pl.Path(stpath).iterdir() if file.is_dir()):       
        if file.name == target_dir: yield file.as_posix()            
        yield from finddir(file, target_dir)
        
print(list(finddir(tpath, tdir)))

 

Note the generator expression "(file for file in pl.Path(stpath).iterdir() if file.is_dir())" is being used as an iterator, which is a way to insert a filter (the if condition) into a for loop at the iteration step. As finddir() is itself a generator, we need "yield from" on the recursive finddir() call so that the values yielded by the recursively called routine are passed back up to the caller. The value we actually yield is in the "yield file.as_posix()" expression, which returns the target directory's path as a forward-slash "/" delimited file path (just because I don't like the "\\" that is inserted for Windows paths). We look at "yield from" again a bit later.

By passing the generator function call finddir() to list() we wrap a list constructor around the yielded values. The constructor iterates through its argument if given an iterator, so it drives the generator through the entire directory tree, building a list until the iterator/generator is exhausted. If we had instead written [finddir(tpath, tdir)] we would have a list containing a single generator object primed for generation, but never actually iterated. Using the list() constructor causes the generator to be iterated during construction.

Now, I said I would return to the issue of tuples and comprehensions, so let us consider that now. The comprehension syntax you might expect for a tuple has been claimed by generator expressions, so the way to build something like a tuple comprehension is to explicitly convert a generator expression to a tuple using the tuple() constructor:


squared_tuple = tuple(x**2 for x in range(5))
print(squared_tuple)  # Output: (0, 1, 4, 9, 16)

 

Why Use Generator Expressions?

  • They’re memory-efficient, as values are generated lazily.

  • Ideal for one-off use cases.

  • As a way to insert a filter into a for loop at the iteration step.
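The memory claim is easy to check with sys.getsizeof(): the generator object's size is a small constant, independent of how many values it will eventually produce, while the equivalent list grows with its contents.

```python
import sys

# The list stores a million results; the generator stores only its paused state.
big_list = [x**2 for x in range(10**6)]
big_gen  = (x**2 for x in range(10**6))

print(sys.getsizeof(big_list))  # several megabytes
print(sys.getsizeof(big_gen))   # a few hundred bytes, regardless of the range size
```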

 

4. Key Features of Generators

  1. Lazy Evaluation:

    • Values are generated only when needed, unlike lists, sets, tuples, etc., which generate all entries first.

    • Example:

      
      # Items are generated individually and only when 
      # the print statement calls for them to be generated.
      
      def count_up_to(n):
          i = 1
          while i <= n:
              yield i
              i += 1
      
      for number in count_up_to(5):
          print(number)  # Output: 1, 2, 3, 4, 5
      
      
      
  2. Memory Efficiency:

    • Since generators yield values one at a time, they use significantly less memory than lists.

    • Example:

      
      big_list = [x**2 for x in range(10**6)]  # Requires memory for the entire list
      big_gen  = (x**2 for x in range(10**6))  # Memory-efficient - requires memory for one item
      
              
  3. State Retention:

    • Generators remember their state between calls: each subsequent next() call resumes from the line after the yield statement, with all local variables retaining their last values. This is effectively a form of cooperative, synchronous concurrency.

  4. Infinite Sequences:

    • Generators can be used to create infinite sequences, but on demand (one sequence step at a time).

    • Example:

      
      def infinite_sequence():
          i = 0
          while True:
              yield i
              i += 1
      
      gen = infinite_sequence()
      print(next(gen))  # Output: 0
      print(next(gen))  # Output: 1
      
      
    • Note in the above example that we are assigning the generator object to a variable "gen". If the output surprises you (why does assignment not run the function at all, so that output starts at 0?), this example was examined step by step in section 2 above.
 

5. Use Cases for Generators

  1. Processing Large Files:

    • Generators can process large files line by line without loading the entire file into memory.

    • Example:

      def read_large_file(file_path):
          with open(file_path, 'r') as file:
              for line in file:
                  yield line.strip()
      
      for line in read_large_file('large_file.txt'):
          print(line)
      
  2. Pipelining Data Processing:

    • Generators can be chained together to create efficient data pipelines.

    • Example:

      def generator1():
          for i in range(10):
              yield i
      
      def generator2(gen):
          for value in gen:
              yield value ** 2
      
      pipeline = generator2(generator1())
      for result in pipeline:
          print(result)  # Outputs squares of 0 through 9
      
  3. Implementing Iterators:

    • Generators are an easy way to implement custom iterators.

  4. Lazy Combinatorics:

    • Generators can handle combinations, permutations, and cartesian products efficiently.
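The standard library's itertools module supplies these as lazy iterators, so even a combinatorial space far too large to materialise can be consumed one value at a time. A quick illustration:

```python
import itertools

# permutations() is lazy - values are produced only as requested.
perms = itertools.permutations("ABC", 2)
print(next(perms))  # ('A', 'B')
print(next(perms))  # ('A', 'C')

# A million-element cartesian product is created instantly;
# only the values you actually consume are ever built.
grid = itertools.product(range(1000), range(1000))
print(next(grid))   # (0, 0)
```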

 

6. Comparison Between Generators and Lists

Feature Generator List
Memory Usage Memory-efficient, generates values lazily. Stores all values in memory at once.
Evaluation Lazy: Produces one value at a time. Eager: Computes all values immediately.
Use Case Suitable for large/infinite sequences. Suitable for small/medium-sized datasets.
 

7. Advanced Generator Techniques

Generator Delegation with yield from:

The yield from statement is used to delegate part of a generator’s operations to another generator.

Example:

def generator1():
    yield 1
    yield 2

def generator2():
    yield 3
    yield from generator1()  # Delegates to generator1
    yield 4

for value in generator2():
    print(value)  # Output: 3, 1, 2, 4
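Beyond simple delegation, yield from also captures the subgenerator's return value, something a plain for-loop over the subgenerator cannot do. A minimal sketch (the names subtotal and report are hypothetical):

```python
def subtotal(items):
    total = 0
    for x in items:
        yield x
        total += x
    return total  # Becomes the value of the 'yield from' expression

def report(items):
    # Re-yields each item, then captures the subgenerator's return value
    total = yield from subtotal(items)
    yield f"total={total}"

print(list(report([1, 2, 3])))  # [1, 2, 3, 'total=6']
```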
 

Generator with send():

Generators can receive data from the outside using the send() method.

Example:

def echo():
    while True:
        value = yield
        print(value)

gen = echo()
next(gen)         # Start the generator
gen.send("Hello") # Output: Hello
gen.send("World") # Output: World
 

Using throw() and close():

  • throw(): Injects an exception into the generator.

  • close(): Stops the generator and cleans up resources.
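A minimal sketch of both: throw() raises the given exception at the paused yield inside the generator (where it can be caught), and close() raises GeneratorExit there, letting any cleanup code run.

```python
def worker():
    try:
        while True:
            try:
                value = yield
                print(f"Processing {value}")
            except ValueError as e:
                print(f"Recovered from: {e}")  # Injected via throw()
    finally:
        print("Cleaning up")  # Runs when close() is called

gen = worker()
next(gen)                         # Prime the generator
gen.send("job 1")                 # Processing job 1
gen.throw(ValueError("bad job"))  # Recovered from: bad job
gen.close()                       # Cleaning up
```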

 

8. Generators as Context Managers

We will now look briefly at a totally different application of generators. So far we have looked at generators as lazy iterators, but when decorated with @contextmanager a generator becomes a kind of "context manager" generator. To explore this idea we need to review our understanding of context managers. Forgive me while I take a short detour from the topic of generators.

You might be familiar with the "with" statement in Python as an alternative way to manage entry and exit in file handling:


print("File is not open here")
with open("myfile.txt", "rt") as f:
    print("File is open in here")
    mybuf = f.read()

print("File is closed again here.")
 

This is an example of a context manager in action. The file is opened on entry to the "with" block and held open for the duration of the block. On exit from the block it is automatically closed without the need for an explicit f.close() call. This works because the file object implements a "context manager" and the "with" statement supports the concept of a runtime context manager which is implemented using a pair of methods of a class:

  • __enter__() : Enter a runtime context and return either this object or another object related to the runtime context which will be bound to the identifier of the with statement (f in the example above)
  • __exit__(exc_type, exc_val, exc_tb): Exit the runtime context and return a boolean flag indicating whether an exception that occurred should be suppressed.

Helpers for building context managers are provided in the contextlib module, but the protocol itself is part of the language: a class implements the context manager protocol by implementing these two methods, so any suitable class can be designed to work with the "with" statement. For example, this is a complete implementation of a FileWriter class implementing the context manager protocol, enabling it to be used in a "with" statement:


class FileWriter:
    def __init__(self, filename, mode='w'):
        self.filename = filename
        self.mode = mode
        self.file = None

    def __enter__(self):
        print(f"Opening file: {self.filename}")
        self.file = open(self.filename, self.mode)
        return self.file  # This is assigned to the variable in the 'with' block

    def __exit__(self, exc_type, exc_val, exc_tb):
        print(f"Closing file: {self.filename}")
        if self.file:
            self.file.close()
        # Returning False lets exceptions propagate if any occurred
        return False

# Usage
with FileWriter("example.txt") as f:
    f.write("Hello from your custom context manager!\n")
 

Generators can be designed to deliver a very similar outcome without the need for a dedicated class. This is achieved by using the @contextmanager decorator. When this decorator is used, the __enter__() portion is the part of the function preceding the yield statement, and the __exit__() portion is everything that follows the yield. In the following example the exit portion is the "finally" block of the "try" statement:


import os
from contextlib import contextmanager

@contextmanager
def change_dir(destination):
    prev_dir = os.getcwd()
    try:
        os.chdir(destination)
        yield
    finally:
        os.chdir(prev_dir)

# Usage
with change_dir("/tmp"):
    print("Current directory:", os.getcwd())

print("Back to original directory:", os.getcwd())

 

In the above example we change the current directory for the duration of the "with" block to whatever directory is given in the calling parameter (in this case "/tmp") and then restore the previous working directory on exit from the with block.

Let's look at another example. We will now build a timer as a context manager. Our timer is designed to measure the execution time of a block of code. We are going to implement it as both a class and a generator. Let's first look at the class version:


import time

class CodeTimer:
    def __init__(self, name="Block"):
        self.name = name
        self.start = None

    def __enter__(self):
        self.start = time.perf_counter()
        print(f"[{self.name}] starting...")
        return self  # Optional, but allows access to duration in `with` block

    def __exit__(self, exc_type, exc_val, exc_tb):
        end = time.perf_counter()
        self.duration = end - self.start
        print(f"[{self.name}] finished in {self.duration:.4f} seconds.")
        
with CodeTimer("Matrix multiplication") as t:
    result = [[i*j for j in range(1000)] for i in range(1000)]
# Access t.duration if needed

 

Now let's look at this same beast as a decorated generator:


import time
from contextlib import contextmanager

# A generator of context! Anything (including nothing) can be yielded.
# Here we yield a callable (elapsed(), which will give the elapsed time),
# but we could have yielded the time object or even a dictionary containing
# the start time: {"start": start}. The yielded value will be assigned by the "as"
# keyword. If we just had "yield" with no value, we would omit the "as" keyword
# from the "with" statement. The generator pauses at the yield statement until
# the "with" block finishes, and then resumes.

@contextmanager
def code_timer(name="Block"):
    start = time.perf_counter()
    
    def elapsed():
        return time.perf_counter() - start
    	
    print(f"[{name}] starting...")
    try:
        yield elapsed # Yielding a value makes it assignable via "as" (optional)
    finally:
        end = time.perf_counter()
        duration = end - start
        print(f"[{name}] finished in {duration:.4f} seconds.")
        
with code_timer("Matrix multiplication") as t:
    result = [[i*j for j in range(1000)] for i in range(1000)]

 

Both of these implementations achieve the same thing, but the second is lighter as it does not require the construction of an entire class.

Before we leave this topic we will consider one more example. We can use this strategy to impose a temporary file lock for the duration of the with block (and one can see how this could be extended to mutexes guarding shared memory or variables in concurrent threads):


import time
import fcntl
from contextlib import contextmanager

@contextmanager
def file_lock(path, mode='r+', timeout=5, retry_interval=0.1):
    with open(path, mode) as f:
        start_time = time.time()
        # Retry the non-blocking lock until acquired or the timeout expires
        while True:
            try:
                fcntl.flock(f, fcntl.LOCK_EX | fcntl.LOCK_NB)  # Non-blocking lock attempt
                break
            except BlockingIOError:
                if time.time() - start_time >= timeout:
                    raise TimeoutError(f"Failed to acquire lock on {path} after {timeout} seconds.")
                time.sleep(retry_interval)
        print(f"Acquired lock on {path}")
        try:
            yield f
        finally:
            fcntl.flock(f, fcntl.LOCK_UN)
            print(f"Released lock on {path}")

lock_file = "/tmp/my_sim.lock"

try:
    with file_lock(lock_file, 'w+', timeout=3, retry_interval=0.2) as f:
        f.write("Running critical section...\n")
        f.flush()
        time.sleep(2)  # Simulate a protected operation
except TimeoutError as e:
    print(e)
except Exception as e:
    print(f"Unknown exception: {e}")


 

Note that "fcntl.flock()" is advisory as it only works if all cooperating processes honour it. Here we yield the locked file itself so the calling process can read or write to it within the block.

 

9. Generator as Coroutine Receiver

A generator function can be designed to act as a coroutine awaiting messages with the yield statement turned into a receiver of data sent from outside the function:


def grep(pattern):
    print(f"Looking for: {pattern}")
    try:
        while True:
            line = (yield)
            if pattern in line:
                print(f">> {line}")
    except GeneratorExit:
        print("Going offline...")

g = grep("python")    # Request the generator object and set the target
next(g)               # Prime the generator receiver
g.send("hello world") # Send a message
g.send("python generators are powerful")  #Send another message
g.close()             # Send the close message

 

We would use this capability in dispatchers, reactive pipelines, and streaming filters.

 

10. Generator as Pipeline or Dataflow Graph

Generators can be chained into elegant dataflow streams:


def numbers():
    for i in range(10):
        yield i

def squared(source):
    for value in source:
        yield value ** 2

def even_only(source):
    for value in source:
        if value % 2 == 0:
            yield value

for x in even_only(squared(numbers())):
    print(x)

 

We would use this capability for modular ETL pipelines, and lazy evaluation in data transformation.

 

11. Recursive Generators

Generators can be used recursively to make a lazy recursive function. For example, we can flatten nested lists lazily:


def walk(nested):
    for x in nested:
        if isinstance(x, list):
            yield from walk(x)
        else:
            yield x


 

We would use this capability for flattening data trees, traversing ASTs, and directory structures, etc.
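As a concrete check (repeating the walk() definition above so the snippet runs standalone), walk() yields the leaves of an arbitrarily nested list in order, and its laziness means we can stop early without ever flattening the whole structure:

```python
def walk(nested):
    for x in nested:
        if isinstance(x, list):
            yield from walk(x)
        else:
            yield x

nested = [1, [2, [3, [4]], 5], 6]
print(list(walk(nested)))  # [1, 2, 3, 4, 5, 6]

# Laziness means we can stop after the first two leaves:
first_two = [x for _, x in zip(range(2), walk(nested))]
print(first_two)           # [1, 2]
```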

 

12. Lightweight State Machine

Generators are a natural fit for finite state machine mechanics enabling implementation without the need to build classes:


def traffic_light():
    while True:
        yield "Green"
        yield "Yellow"
        yield "Red"

light = traffic_light()
for _ in range(6):
    print(next(light))
 

Finite state machines have a very wide range of applications from process simulation & control, through game logic, simulations and protocol handlers. Every interpreter is a kind of finite state machine.

 

13. Population Simulator Demonstrating Multiple Generators

As a final step (at least, until I think of something else to explore) I will demonstrate a number of these specialised generators in one application. Here we build a population simulator that grows a population of sims who live in blocks of two pods to a block, splitting into a new block every time the population in any one pod reaches 8 or more. We represent the blocks as lists of up to two pods or blocks, so as the population grows it builds a tree of lists of lists.

This is the first version of the code, which I am very much still working on, but I have had to move on to another project for a few days, so come back in a week when I have had time to return to it. This code works and demonstrates the interplay of generators, but I want to add more and make the population flow a little more interesting:


from contextlib import contextmanager
import random

# --------------------------------------
# Recursive generator yielding the population of every pod
# --------------------------------------
def count_population(tree):
    for node in tree:
        if isinstance(node, list):
            yield from count_population(node)
        else:
            yield node

# --------------------------------------
# FSM generator for state transitions
# --------------------------------------
def sim_state_machine():
    states = ["idle", "growing", "dividing", "dieing"]
    i = 0
    while True:
        yield states[i % len(states)]
        i += 1

# --------------------------------------
# Context manager for logging
# --------------------------------------
@contextmanager
def sim_logger():
    print("SimTree log started")
    yield lambda msg: print(f"LOG: {msg}")
    print("SimTree log closed")

# --------------------------------------
# Coroutine-style simulation engine
# --------------------------------------
def simtree_engine(threshold=10):
    tick = 0
    state_gen = sim_state_machine()
    tree = [random.randint(2, 5)]  # Root cell

    while True:
        state = next(state_gen)

        # Division step: recursively rebuild tree
        def divide(node):
            if isinstance(node, list):
                return [divide(child) for child in node]
            elif node > threshold:
                split = node // 2
                return [split, node - split]
            else:
                return node

        if state == "dividing":
            tree = divide(tree)

        # Compute total population
        total = sum(count_population(tree))

        # Yield simulation state
        control = yield {
            "tick": tick,
            "state": state,
            "population": total,
            "structure": tree
        }

        tick += 1
        # In "growing" state, increment every leaf
        if state == "growing":
            def grow(node):
                if isinstance(node, list):
                    return [grow(child) for child in node]
                else:
                    return node + random.randint(1, node)
            tree = grow(tree)

        # In "dieing" state, decrement every leaf
        if state == "dieing":
            def die(node):
                if isinstance(node, list):
                    return [die(child) for child in node]
                else:
                    return node - random.randint(0, node//2) if node > 0 else 0
            tree = die(tree)

def run_simtree(ticks=12):
    sim = simtree_engine(threshold=8)
    next(sim)

    with sim_logger() as log:
        for _ in range(ticks):
            result = sim.send(None)
            log(result)
            print(f">>> Tick {result['tick']:02d} | State: {result['state']:>9} | "
                  f"Pop: {result['population']:>3} | Tree: {result['structure']}")

run_simtree(ticks=40)
	
 

It produces an output similar to:

 

SimTree log started
LOG: {'tick': 1, 'state': 'growing', 'population': 5, 'structure': [5]}
>>> Tick 01 | State:   growing | Pop:   5 | Tree: [5]
LOG: {'tick': 2, 'state': 'dividing', 'population': 9, 'structure': [[4, 5]]}
>>> Tick 02 | State:  dividing | Pop:   9 | Tree: [[4, 5]]
LOG: {'tick': 3, 'state': 'dieing', 'population': 9, 'structure': [[4, 5]]}
>>> Tick 03 | State:    dieing | Pop:   9 | Tree: [[4, 5]]
LOG: {'tick': 4, 'state': 'idle', 'population': 7, 'structure': [[2, 5]]}
>>> Tick 04 | State:      idle | Pop:   7 | Tree: [[2, 5]]
LOG: {'tick': 5, 'state': 'growing', 'population': 7, 'structure': [[2, 5]]}
>>> Tick 05 | State:   growing | Pop:   7 | Tree: [[2, 5]]
LOG: {'tick': 6, 'state': 'dividing', 'population': 10, 'structure': [[4, 6]]}
>>> Tick 06 | State:  dividing | Pop:  10 | Tree: [[4, 6]]
LOG: {'tick': 7, 'state': 'dieing', 'population': 10, 'structure': [[4, 6]]}
>>> Tick 07 | State:    dieing | Pop:  10 | Tree: [[4, 6]]
LOG: {'tick': 8, 'state': 'idle', 'population': 10, 'structure': [[4, 6]]}
>>> Tick 08 | State:      idle | Pop:  10 | Tree: [[4, 6]]
LOG: {'tick': 9, 'state': 'growing', 'population': 10, 'structure': [[4, 6]]}
>>> Tick 09 | State:   growing | Pop:  10 | Tree: [[4, 6]]
LOG: {'tick': 10, 'state': 'dividing', 'population': 18, 'structure': [[7, [5, 6]]]}
>>> Tick 10 | State:  dividing | Pop:  18 | Tree: [[7, [5, 6]]]
LOG: {'tick': 11, 'state': 'dieing', 'population': 18, 'structure': [[7, [5, 6]]]}
>>> Tick 11 | State:    dieing | Pop:  18 | Tree: [[7, [5, 6]]]
LOG: {'tick': 12, 'state': 'idle', 'population': 15, 'structure': [[7, [4, 4]]]}
>>> Tick 12 | State:      idle | Pop:  15 | Tree: [[7, [4, 4]]]
LOG: {'tick': 13, 'state': 'growing', 'population': 15, 'structure': [[7, [4, 4]]]}
>>> Tick 13 | State:   growing | Pop:  15 | Tree: [[7, [4, 4]]]
LOG: {'tick': 14, 'state': 'dividing', 'population': 26, 'structure': [[[6, 7], [6, 7]]]}
>>> Tick 14 | State:  dividing | Pop:  26 | Tree: [[[6, 7], [6, 7]]]
LOG: {'tick': 15, 'state': 'dieing', 'population': 26, 'structure': [[[6, 7], [6, 7]]]}
>>> Tick 15 | State:    dieing | Pop:  26 | Tree: [[[6, 7], [6, 7]]]
LOG: {'tick': 16, 'state': 'idle', 'population': 22, 'structure': [[[5, 6], [5, 6]]]}
>>> Tick 16 | State:      idle | Pop:  22 | Tree: [[[5, 6], [5, 6]]]
LOG: {'tick': 17, 'state': 'growing', 'population': 22, 'structure': [[[5, 6], [5, 6]]]}
>>> Tick 17 | State:   growing | Pop:  22 | Tree: [[[5, 6], [5, 6]]]
LOG: {'tick': 18, 'state': 'dividing', 'population': 28, 'structure': [[[6, 7], [8, 7]]]}
>>> Tick 18 | State:  dividing | Pop:  28 | Tree: [[[6, 7], [8, 7]]]
LOG: {'tick': 19, 'state': 'dieing', 'population': 28, 'structure': [[[6, 7], [8, 7]]]}
>>> Tick 19 | State:    dieing | Pop:  28 | Tree: [[[6, 7], [8, 7]]]
LOG: {'tick': 20, 'state': 'idle', 'population': 17, 'structure': [[[3, 4], [6, 4]]]}
>>> Tick 20 | State:      idle | Pop:  17 | Tree: [[[3, 4], [6, 4]]]
LOG: {'tick': 21, 'state': 'growing', 'population': 17, 'structure': [[[3, 4], [6, 4]]]}
>>> Tick 21 | State:   growing | Pop:  17 | Tree: [[[3, 4], [6, 4]]]
LOG: {'tick': 22, 'state': 'dividing', 'population': 24, 'structure': [[[4, 5], [[5, 5], 5]]]}
>>> Tick 22 | State:  dividing | Pop:  24 | Tree: [[[4, 5], [[5, 5], 5]]]
LOG: {'tick': 23, 'state': 'dieing', 'population': 24, 'structure': [[[4, 5], [[5, 5], 5]]]}
>>> Tick 23 | State:    dieing | Pop:  24 | Tree: [[[4, 5], [[5, 5], 5]]]
LOG: {'tick': 24, 'state': 'idle', 'population': 19, 'structure': [[[2, 5], [[5, 3], 4]]]}
>>> Tick 24 | State:      idle | Pop:  19 | Tree: [[[2, 5], [[5, 3], 4]]]
LOG: {'tick': 25, 'state': 'growing', 'population': 19, 'structure': [[[2, 5], [[5, 3], 4]]]}
>>> Tick 25 | State:   growing | Pop:  19 | Tree: [[[2, 5], [[5, 3], 4]]]
LOG: {'tick': 26, 'state': 'dividing', 'population': 33, 'structure': [[[4, 8], [[[4, 5], 6], 6]]]}
>>> Tick 26 | State:  dividing | Pop:  33 | Tree: [[[4, 8], [[[4, 5], 6], 6]]]
LOG: {'tick': 27, 'state': 'dieing', 'population': 33, 'structure': [[[4, 8], [[[4, 5], 6], 6]]]}
>>> Tick 27 | State:    dieing | Pop:  33 | Tree: [[[4, 8], [[[4, 5], 6], 6]]]
LOG: {'tick': 28, 'state': 'idle', 'population': 26, 'structure': [[[2, 8], [[[2, 4], 4], 6]]]}
>>> Tick 28 | State:      idle | Pop:  26 | Tree: [[[2, 8], [[[2, 4], 4], 6]]]
LOG: {'tick': 29, 'state': 'growing', 'population': 26, 'structure': [[[2, 8], [[[2, 4], 4], 6]]]}
>>> Tick 29 | State:   growing | Pop:  26 | Tree: [[[2, 8], [[[2, 4], 4], 6]]]
LOG: {'tick': 30, 'state': 'dividing', 'population': 39, 'structure': [[[4, [5, 6]], [[[3, 8], 5], 8]]]}
>>> Tick 30 | State:  dividing | Pop:  39 | Tree: [[[4, [5, 6]], [[[3, 8], 5], 8]]]
LOG: {'tick': 31, 'state': 'dieing', 'population': 39, 'structure': [[[4, [5, 6]], [[[3, 8], 5], 8]]]}
>>> Tick 31 | State:    dieing | Pop:  39 | Tree: [[[4, [5, 6]], [[[3, 8], 5], 8]]]
LOG: {'tick': 32, 'state': 'idle', 'population': 28, 'structure': [[[4, [5, 3]], [[[2, 6], 3], 5]]]}
>>> Tick 32 | State:      idle | Pop:  28 | Tree: [[[4, [5, 3]], [[[2, 6], 3], 5]]]
LOG: {'tick': 33, 'state': 'growing', 'population': 28, 'structure': [[[4, [5, 3]], [[[2, 6], 3], 5]]]}
>>> Tick 33 | State:   growing | Pop:  28 | Tree: [[[4, [5, 3]], [[[2, 6], 3], 5]]]
LOG: {'tick': 34, 'state': 'dividing', 'population': 45, 'structure': [[[5, [7, 4]], [[[4, [6, 6]], 5], 8]]]}
>>> Tick 34 | State:  dividing | Pop:  45 | Tree: [[[5, [7, 4]], [[[4, [6, 6]], 5], 8]]]
LOG: {'tick': 35, 'state': 'dieing', 'population': 45, 'structure': [[[5, [7, 4]], [[[4, [6, 6]], 5], 8]]]}
>>> Tick 35 | State:    dieing | Pop:  45 | Tree: [[[5, [7, 4]], [[[4, [6, 6]], 5], 8]]]
LOG: {'tick': 36, 'state': 'idle', 'population': 36, 'structure': [[[5, [4, 2]], [[[3, [6, 4]], 5], 7]]]}
>>> Tick 36 | State:      idle | Pop:  36 | Tree: [[[5, [4, 2]], [[[3, [6, 4]], 5], 7]]]
LOG: {'tick': 37, 'state': 'growing', 'population': 36, 'structure': [[[5, [4, 2]], [[[3, [6, 4]], 5], 7]]]}
>>> Tick 37 | State:   growing | Pop:  36 | Tree: [[[5, [4, 2]], [[[3, [6, 4]], 5], 7]]]
LOG: {'tick': 38, 'state': 'dividing', 'population': 60, 'structure': [[[[4, 5], [8, 3]], [[[5, [[5, 5], 6]], [4, 5]], [5, 5]]]]}
>>> Tick 38 | State:  dividing | Pop:  60 | Tree: [[[[4, 5], [8, 3]], [[[5, [[5, 5], 6]], [4, 5]], [5, 5]]]]
LOG: {'tick': 39, 'state': 'dieing', 'population': 60, 'structure': [[[[4, 5], [8, 3]], [[[5, [[5, 5], 6]], [4, 5]], [5, 5]]]]}
>>> Tick 39 | State:    dieing | Pop:  60 | Tree: [[[[4, 5], [8, 3]], [[[5, [[5, 5], 6]], [4, 5]], [5, 5]]]]
LOG: {'tick': 40, 'state': 'idle', 'population': 42, 'structure': [[[[2, 5], [6, 2]], [[[3, [[4, 3], 3]], [2, 4]], [5, 3]]]]}
>>> Tick 40 | State:      idle | Pop:  42 | Tree: [[[[2, 5], [6, 2]], [[[3, [[4, 3], 3]], [2, 4]], [5, 3]]]]
SimTree log closed	

 

14. Limitations of Generators

  1. One-Time Iteration:

    • Generators can only be iterated once. If you need to iterate again, you have to recreate the generator.

  2. No Random Access:

    • You can’t access elements by index like you can with lists.

  3. Debugging Complexity:

    • Debugging generators can be tricky because values are computed lazily.
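The first two limitations are easy to demonstrate. A small sketch: a generator expression is exhausted after one full pass, and indexing it raises a TypeError; if you need "the nth value" you can skip ahead lazily with itertools.islice instead.

```python
from itertools import islice

# 1. One-time iteration: a generator is exhausted after one pass
gen = (n * n for n in range(5))
print(list(gen))   # [0, 1, 4, 9, 16]
print(list(gen))   # [] -- already exhausted, must be recreated

# 2. No random access: gen[2] would raise TypeError.
# islice skips ahead lazily without materializing the sequence.
gen = (n * n for n in range(5))
third = next(islice(gen, 2, 3))
print(third)       # 4
```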

 

15. Practical Guidelines

  • Use iterating generators for large datasets or streams of data.

  • Use list() or tuple() to materialize a generator when you need all the values at once.

  • Combine generators with itertools for powerful data processing pipelines.

  • Use @contextmanager to turn a non-iterating generator into a context manager, as with sim_logger() above.