Generator as Coroutines

  • cooperative multitasking (cooperative routines)
  • concurrent, not parallel (a Python program executes on a single thread)

Two ways to create coroutines:

  • generators (asyncio)
  • native coroutines (using async/await)
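To get a concrete feel for the difference, here is a minimal sketch (the names gen_coro and native_coro are purely illustrative): a generator-based coroutine driven by hand, next to a native coroutine driven by asyncio.

```python
import asyncio

# 1. Generator-based coroutine: yield suspends it, the caller drives it.
def gen_coro():
    received = yield "ready"
    print(f"generator coroutine got: {received}")

g = gen_coro()
print(next(g))           # prime it; runs up to the first yield -> "ready"
try:
    g.send("hello")      # resumes; the generator then returns -> StopIteration
except StopIteration:
    pass

# 2. Native coroutine: defined with async def, driven by an event loop.
async def native_coro():
    await asyncio.sleep(0)   # suspension point, analogous to yield
    print("native coroutine ran")

asyncio.run(native_coro())
```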

Concepts

  • concurrency: tasks start, run and complete in overlapping time periods
  • parallelism: tasks run simultaneously

  • cooperative: control is relinquished to another task voluntarily; controlled by the application (developer)
  • preemptive: control is relinquished to another task involuntarily; controlled by the OS

    some sort of scheduler is involved

  • Global Interpreter Lock (GIL)

    Only one native thread executes at a time.

    Use process-based parallelism to avoid the GIL, not thread-based (see the sketch below).

    The Python threading module uses threads instead of processes. Threads run in the same memory heap, whereas processes run in separate memory heaps; this makes sharing information and object instances harder with processes. Because threads share the same memory heap, multiple threads can write to the same location in it, which is why the global interpreter lock (GIL) in CPython was created as a mutex to prevent that from happening.
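A minimal sketch of that difference in practice (the count_down workload and the timing harness are my own illustration, not from the original post): a pure-Python CPU-bound loop gains nothing from a thread pool because of the GIL, but does scale with a process pool.

```python
import time
from concurrent.futures import ThreadPoolExecutor, ProcessPoolExecutor

def count_down(n):
    # Pure-Python CPU-bound loop; it holds the GIL the whole time it runs.
    while n > 0:
        n -= 1

def timed(executor_cls, n=5_000_000, workers=4):
    # Run the same workload in several workers and measure wall-clock time.
    start = time.perf_counter()
    with executor_cls(max_workers=workers) as ex:
        list(ex.map(count_down, [n] * workers))
    return time.perf_counter() - start

if __name__ == "__main__":   # guard required for process pools on spawn-based platforms
    print("threads:  ", timed(ThreadPoolExecutor))    # serialized by the GIL
    print("processes:", timed(ProcessPoolExecutor))   # can use multiple cores
```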

Make the right choice

  • CPU Bound => Multiprocessing
  • I/O Bound, Fast I/O, Limited Connections => Multithreading
  • I/O Bound, Slow I/O, Many Connections => Concurrency

Use deque

A much more efficient way to implement stacks and queues.

Operating on 10,000 items, averaged over 1,000 runs:

(times in seconds)    list     deque
append (right)        0.87     0.87
pop (right)           0.002    0.0005
insert (left)         20.8     0.84
pop (left)            0.012    0.0005

Create an unbounded deque with deque() or deque(iterable).
Create a bounded deque with deque(maxlen=n). When it is full, appending discards a corresponding number of items from the opposite end.
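A quick sketch of both forms (the values are arbitrary):

```python
from collections import deque

dq = deque()             # unbounded
dq.append(1)             # push on the right
dq.appendleft(0)         # push on the left
print(dq)                # deque([0, 1])
print(dq.pop())          # 1  -> stack-like (right end)
print(dq.popleft())      # 0  -> queue-like (left end)

bounded = deque(maxlen=3)
for i in range(5):
    bounded.append(i)    # once full, the oldest item is dropped from the left
print(bounded)           # deque([2, 3, 4], maxlen=3)
```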

Implement producer / consumer coroutine using deque
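A minimal sketch of the idea (the names producer, consumer, and the hand-rolled scheduling loop are my assumptions, not the original post's code): two generators share a deque as a buffer, and each yield hands control back to the caller.

```python
from collections import deque

def producer(dq, n):
    # Push n items into the shared deque, yielding control after each one.
    for i in range(n):
        dq.appendleft(i)
        print(f"produced {i}")
        yield

def consumer(dq):
    # Drain whatever is in the deque each time we get control.
    while True:
        while dq:
            print(f"consumed {dq.pop()}")
        yield

buffer = deque()
prod = producer(buffer, 3)
cons = consumer(buffer)
for _ in range(3):       # naive round-robin between the two coroutines
    next(prod)
    next(cons)
```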

Implement simple event loop
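Again just a sketch (run_event_loop and countdown_task are illustrative names): a deque of ready tasks, each a generator, scheduled round-robin until they all finish.

```python
from collections import deque

def countdown_task(name, n):
    # Each task is a generator; every yield hands control back to the loop.
    while n > 0:
        print(f"{name}: {n}")
        yield
        n -= 1

def run_event_loop(tasks):
    ready = deque(tasks)
    while ready:
        task = ready.popleft()
        try:
            next(task)           # run the task up to its next yield
        except StopIteration:
            continue             # task finished; drop it
        ready.append(task)       # otherwise reschedule it at the back of the queue

run_event_loop([countdown_task("A", 3), countdown_task("B", 2)])
```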

Generator Concepts

Generator State

  • Created: the generator function has been invoked, but its body has not started running
  • Running: next() is executing the body; the first call primes the generator
  • Suspended: the body is paused at a yield
  • Closed: the generator has returned
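These states can be observed with inspect.getgeneratorstate(); a minimal sketch:

```python
from inspect import getgeneratorstate

def gen():
    yield "hello"

g = gen()
print(getgeneratorstate(g))   # GEN_CREATED: function invoked, body not started
next(g)                       # prime it: runs up to the first yield
print(getgeneratorstate(g))   # GEN_SUSPENDED: paused at the yield
g.close()
print(getgeneratorstate(g))   # GEN_CLOSED
# GEN_RUNNING is only observable from code executing inside the generator itself.
```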

Send to Generator

yield is actually an expression: you can yield a value like yield "hello", receive a value like received = yield, or even combine both: received = yield "hello".

Notice that we can only send data when the generator is SUSPENDED at a yield.
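A small sketch putting the pieces together (the echo generator is just for illustration): prime first, then send.

```python
def echo():
    received = yield "ready"                   # yields a value AND waits for one
    while True:
        received = yield f"you sent: {received}"

g = echo()
print(next(g))           # prime it -> "ready"; now suspended at a yield
print(g.send("hello"))   # resumes; received becomes "hello" -> "you sent: hello"
print(g.send("world"))   # "you sent: world"
```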

Be careful when mixing the two usages in your code:

  • difficult to read
  • sometimes useful
  • often not needed

Closing generator

Calling close() raises a GeneratorExit exception inside the generator. This exception:

  • is silenced by Python, so the caller never sees it
  • does not inherit from Exception
  • requires the generator to do the actual closing: return or raise an exception
    It's perfectly OK not to catch it; simply let it bubble up.
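A sketch of what that looks like (the reader generator is illustrative): catching GeneratorExit is optional, and re-raising or simply returning both count as doing the actual closing.

```python
def reader():
    try:
        while True:
            line = yield
            print(f"read: {line}")
    except GeneratorExit:
        print("cleaning up")   # optional cleanup; do NOT yield here
        raise                  # re-raising (or just returning) closes it properly

g = reader()
next(g)                  # prime
g.send("first line")
g.close()                # raises GeneratorExit inside the generator, silenced for us
```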

Sending exception to generator

Think of throw() as the same thing as send(), except that it causes an exception to be raised inside the generator instead of sending it data:

  1. generator catches the exception
  2. generator exits (returns)
  3. caller receives a StopIteration exception

We can throw custom exceptions to trigger specific functionality inside the generator.
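For example, a sketch using a hypothetical Reset exception as a control signal to a running-total coroutine (both names are my own illustration):

```python
class Reset(Exception):
    """Custom exception used purely as a control signal."""

def running_total():
    total = 0
    while True:
        try:
            value = yield total
            total += value
        except Reset:
            total = 0            # the thrown exception triggers a reset

g = running_total()
next(g)                  # prime -> total starts at 0
print(g.send(10))        # 10
print(g.send(5))         # 15
print(g.throw(Reset))    # caught inside, total reset -> 0
print(g.send(3))         # 3
```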

Sending and closing Sub-generators

  • When delegating with yield from, the sub-generator is primed automatically by the delegator, and send(), throw(), and close() on the delegator are forwarded to it
  • When the yield from expression receives a StopIteration exception, it captures the sub-generator's return value as the result of the expression.
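A sketch of both points (averager and delegator are illustrative names): values sent to the delegator pass straight through to the sub-generator, and the sub-generator's return value becomes the result of yield from.

```python
def averager():
    # Sub-generator: accumulate values until it receives None, then return the average.
    total, count = 0, 0
    while True:
        value = yield
        if value is None:
            break
        total += value
        count += 1
    return total / count

def delegator(results):
    # yield from primes the sub-generator and forwards send()/throw()/close() to it.
    result = yield from averager()   # result == the sub-generator's return value
    results.append(result)

results = []
d = delegator(results)
next(d)                  # primes the delegator AND the sub-generator
for v in (10, 20, 30):
    d.send(v)            # passes straight through to averager()
try:
    d.send(None)         # averager returns -> its StopIteration also ends the delegator
except StopIteration:
    pass
print(results)           # [20.0]
```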

Delegating Generators Recursively
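A classic sketch of recursive delegation (the flatten example is a common illustration, not necessarily the one from the original post): a generator that uses yield from on itself for nested sequences.

```python
def flatten(items):
    # Recursively delegate to a nested flatten() generator for inner sequences.
    for item in items:
        if isinstance(item, (list, tuple)):
            yield from flatten(item)
        else:
            yield item

print(list(flatten([1, [2, [3, 4], 5], (6, 7)])))   # [1, 2, 3, 4, 5, 6, 7]
```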
