Generators as Coroutines
- cooperative multitasking (cooperative routines)
- concurrent, not parallel (a Python program executes on a single thread)
Ways to create coroutines:
- generator-based coroutines (used by early asyncio)
- native coroutines (using async / await)
- concurrency: tasks start, run, and complete in overlapping time periods
- parallelism: tasks run simultaneously
- cooperative: control is relinquished to another task voluntarily; controlled by the application (developer)
- preemptive: control is relinquished to another task involuntarily; controlled by the OS
  (some sort of scheduler is involved)
- Global Interpreter Lock (GIL)
Only one native thread executes at a time.
Use process-based parallelism to avoid the GIL, not thread-based.
The Python threading module uses threads instead of processes. Threads run in the same memory heap, whereas processes run in separate memory heaps, which makes sharing information and object instances harder between processes. A problem arises precisely because threads share the same memory heap: multiple threads can write to the same location in it. The global interpreter lock (GIL) in CPython was created as a mutex to prevent this.
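A minimal sketch of the "process-based parallelism" point above: the cpu_bound function and the pool size are illustrative, but the pattern (multiprocessing.Pool for pure-Python CPU work) is the standard way to side-step the GIL, since each worker process has its own interpreter and its own GIL.

```python
from multiprocessing import Pool

def cpu_bound(n):
    # Pure-Python loop: threads would serialize on the GIL here,
    # but separate processes run it truly in parallel.
    return sum(i * i for i in range(n))

if __name__ == "__main__":
    with Pool(processes=4) as pool:
        # Each chunk of work goes to its own process (own heap, own GIL).
        results = pool.map(cpu_bound, [10_000] * 4)
    print(results)
```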
Make the right choice:
- CPU bound => multiprocessing
- I/O bound, fast I/O, limited connections => multithreading
- I/O bound, slow I/O, many connections => concurrency (coroutines)
deque is a much more efficient way to implement a stack and a queue.
Operating on 10,000 items, timed 1,000 times and averaged:
| (times in seconds) | list | deque |
- Use an unbounded deque with deque()
- Use a bounded deque with deque(maxlen=n); if full, a corresponding number of items is discarded from the opposite end.
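The two constructions above can be sketched as follows; the discard-from-the-opposite-end behaviour of a bounded deque makes it a natural ring buffer.

```python
from collections import deque

# Unbounded deque: grows as needed; O(1) appends/pops at BOTH ends,
# unlike list.insert(0, x), which is O(n).
d = deque()
d.append(1)
d.appendleft(0)
print(d)        # deque([0, 1])

# Bounded deque: when full, appending on one end silently discards
# a corresponding item from the opposite end.
ring = deque(maxlen=3)
for i in range(5):
    ring.append(i)
print(ring)     # deque([2, 3, 4], maxlen=3)
```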
Implement producer / consumer coroutine using deque
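A minimal sketch of this idea, assuming a naive round-robin scheduler (the produce / consume names are illustrative): both coroutines are plain generators that yield to hand control back, and the deque is the shared channel between them.

```python
from collections import deque

def produce(dq, items):
    """Coroutine: enqueue one item per step, then yield control."""
    for item in items:
        dq.appendleft(item)
        yield

def consume(dq, out):
    """Coroutine: dequeue from the right (FIFO order) when available."""
    while True:
        if dq:
            out.append(dq.pop())
        yield

dq, out = deque(), []
prod, cons = produce(dq, [1, 2, 3]), consume(dq, out)

# Round-robin "scheduler": alternate until the producer is exhausted.
while True:
    try:
        next(prod)
    except StopIteration:
        break
    next(cons)

print(out)  # [1, 2, 3]
```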
Implement simple event loop
- Created: the generator function has been invoked
- Running: next() is executing the body; the first call primes the generator
- Suspended: paused at a yield
- Closed: the generator has returned
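The four states above can be observed directly with inspect.getgeneratorstate (GEN_RUNNING is only visible from inside the running generator, so this sketch shows the other three):

```python
from inspect import getgeneratorstate

def coro():
    yield "step"

g = coro()
print(getgeneratorstate(g))  # GEN_CREATED: function invoked, body not started
next(g)                      # priming: runs the body up to the first yield
print(getgeneratorstate(g))  # GEN_SUSPENDED: paused at the yield
try:
    next(g)                  # resumes; the body returns
except StopIteration:
    pass
print(getgeneratorstate(g))  # GEN_CLOSED: the generator has returned
```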
Send to Generator
yield is actually an expression: you can yield a value, like yield "hello"; receive a value, like received = yield; or combine both, like received = yield "hello".
Notice that we can only send data when the generator is SUSPENDED at a yield.
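A small sketch of the combined form (the echo generator is a made-up example): priming with next() suspends the generator at its first yield, after which send() both delivers a value and resumes it.

```python
def echo():
    # Yields a value AND receives one at the same suspension point.
    received = yield "ready"
    while True:
        received = yield f"echo: {received}"

g = echo()
print(next(g))        # "ready" -- priming advances to the first yield
print(g.send("hi"))   # "echo: hi" -- send() resumes the SUSPENDED generator
# g2 = echo(); g2.send("hi") would raise TypeError: can't send a
# non-None value to a just-started (GEN_CREATED) generator.
```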
Be careful mixing the two usages in your code:
- difficult to read
- sometimes useful
- often not needed
close() raises a GeneratorExit exception inside the generator. The exception:
- is silenced by Python; the caller never sees it
- does not inherit from Exception (it inherits from BaseException)
- the generator must do the actual closing: return or raise an exception
It's perfectly OK not to catch it; simply let it bubble up.
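A minimal sketch of close() and GeneratorExit; the closed list is just a stand-in for real cleanup work (closing files, releasing locks, etc.):

```python
closed = []

def worker():
    try:
        while True:
            yield
    except GeneratorExit:
        closed.append(True)  # do the actual cleanup here...
        # ...then fall off the end (return). Yielding again here
        # would make close() raise RuntimeError.

g = worker()
next(g)      # prime: suspend at the yield
g.close()    # raises GeneratorExit inside the generator; silenced for us
print(closed)  # [True]
```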
Sending exceptions to a generator
Think of throw() as the same thing as send(), but it causes an exception to be raised inside the generator instead of data being sent. If:
- the generator catches the exception, and
- the generator then exits (returns),
- the caller receives a StopIteration exception.
We can throw custom exceptions to trigger some functionality
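A hedged sketch of that idea (the Reset exception and averager generator are illustrative names, not from the source): throwing a custom exception at the yield lets the caller trigger behaviour inside a still-running generator.

```python
class Reset(Exception):
    """Custom signal delivered with throw() to trigger a reset."""

def averager():
    total = count = 0
    while True:
        try:
            value = yield (total / count if count else None)
        except Reset:
            total = count = 0   # caught: the generator keeps running
        else:
            total += value
            count += 1

g = averager()
next(g)               # prime
g.send(10)
print(g.send(20))     # 15.0
print(g.throw(Reset)) # None -- state was reset, generator resumed
print(g.send(7))      # 7.0
```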
Sending to and Closing Sub-generators
- Delegating to sub-generators with yield from: the sub-generator is primed automatically by the delegator
- If the yield from expression receives a StopIteration exception, it takes the sub-generator's return value as the value of the expression.
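Both points can be sketched together (subgen / delegator are made-up names): the delegator primes the sub-generator, transparently forwards send(), and the sub-generator's return value becomes the value of the yield from expression.

```python
def subgen():
    received = yield "sub ready"
    return f"sub got {received}"   # becomes the value of `yield from`

def delegator():
    # Primes subgen automatically and forwards send()/throw() to it.
    result = yield from subgen()
    yield f"result: {result}"

g = delegator()
print(next(g))          # "sub ready" -- subgen primed through the delegator
print(g.send("hello"))  # "result: sub got hello"
```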
Delegate Generator Recursively
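The recursive case can be sketched as a generator delegating to itself (countdown is a hypothetical example): each level yields once, then hands off to the next level with yield from.

```python
def countdown(n):
    """Recursively delegate: yield n, then delegate to countdown(n - 1)."""
    yield n
    if n > 0:
        yield from countdown(n - 1)

print(list(countdown(3)))  # [3, 2, 1, 0]
```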