#+TITLE: The Asyncio Scheduler

If you read PEP 3156 closely, you'll find a brief description of the
task scheduler introduced with it (emphasis mine).

#+BEGIN_QUOTE
The proposal includes a pluggable event loop (...) a higher-level
scheduler based on yield from (PEP 380).

The scheduler is not pluggable; pluggability occurs at the event loop
level, and the standard scheduler implementation should work with any
conforming event loop implementation. (In fact this is an important
litmus test for conforming implementations.)

For interoperability between code written using coroutines and other
async frameworks, the scheduler defines a Task class that behaves like
a Future. A framework that interoperates at the event loop level can
wait for a Future to complete by adding a callback to the
Future. Likewise, the scheduler offers an operation to suspend a
coroutine until a callback is called.

The scheduler has no public interface. You interact with it by using
yield from future and yield from task. In fact, *there is no single
object representing the scheduler -- its behavior is implemented by
the Task and Future classes* using only the public interface of the
event loop, so it will work with third-party event loop
implementations, too.
#+END_QUOTE

The lack of detail around the scheduler's behavior, together with the
unexpected behavior described in /Latency in Asynchronous Python/,
made me fairly curious about the scheduler's internals, leading to
this post.

The heart of the scheduler revolves around Tasks. Every time a Task is
created:
- it's enqueued onto the current event loop to "step" with a call_soon,
- and it updates global bookkeeping about the state of the world: every
  alive task is kept in a WeakSet, asyncio.tasks._all_tasks, and the
  currently running task per loop in a loop-to-task dictionary,
  asyncio.tasks._current_tasks.
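
Those two registries are private CPython details (observed on roughly
3.8–3.12) and can move between versions, but they are easy to poke at:

#+BEGIN_SRC python
import asyncio

async def child():
    await asyncio.sleep(0)

async def main():
    task = asyncio.ensure_future(child())
    # The new task was registered in the module-level WeakSet of all
    # alive tasks (a private CPython detail, subject to change)...
    print(task in asyncio.tasks._all_tasks)
    # ...while the loop-to-task dict names the task currently stepping,
    # which right now is main() itself, not the child.
    loop = asyncio.get_running_loop()
    print(asyncio.tasks._current_tasks[loop] is asyncio.current_task())
    await task

asyncio.run(main())
#+END_SRC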
The next interesting part is how it decides whether a coroutine is
finished, should yield to allow something else to run, or is simply
returning a value immediately.

I have to admit to being very confused the first time I traced a call
with an async generator that yielded values and found that the outer
coroutine never yielded. The magic is in asyncio.tasks.Task.__step,
which checks the outcome of calling coroutine.send(None) (called
result):
- if the call doesn't raise:
  - result is checked for the magic, duck-typed attribute
    _asyncio_future_blocking. If the attribute is present and True, the
    task is now await-ing result: a callback that resumes the task is
    added to that future before leaving the step.
  - if result is None, it's treated as relinquishing control: the task
    reschedules its own step with call_soon, allowing the loop to run
    other callbacks first.
- if it raises StopIteration, the coroutine finished; its return value
  is saved as the task's result (unless the task had a pending
  cancellation, in which case the task is cancelled instead).
- if it raises CancelledError, the task is cancelled.
- any other exception is captured as the task's exception, with
  KeyboardInterrupt and SystemExit special-cased: they are also
  immediately re-raised.

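The async generator confusion follows directly from the first branch:
a value yield-ed by an async generator is delivered to the consumer
through StopIteration inside __anext__, not by yielding a Future to
the scheduler, so the consuming task never suspends. A small
demonstration (all names here are mine):

#+BEGIN_SRC python
import asyncio

events = []

async def gen():
    for i in range(3):
        yield i          # hands i to the async-for, never to the loop

async def consumer():
    async for i in gen():
        events.append(("consumer", i))

async def bystander():
    events.append(("bystander",))

async def main():
    # consumer's task is scheduled first; since nothing in it awaits a
    # real Future or does a bare yield, it runs to completion before
    # bystander gets a single step.
    await asyncio.gather(consumer(), bystander())

asyncio.run(main())
print(events)
#+END_SRC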
misc links: https://docs.python.org/3/howto/clinic.html