
Infect asyncio #121

Merged: 53 commits from infect_asyncio into master (Dec 17, 2021)
Conversation

@goodboy (Owner) commented on Jun 29, 2020

README blurb summarizing this new feature.

What I hope to be a successful attempt at solving #120.

This gets us the following very interesting functionality:

  • ability to spawn an actor whose process entry point is `asyncio.run()`
  • the `asyncio` actor embeds `trio` using the new guest mode, which then enters the `tractor.Actor._async_main()` entry point, thus engaging all the standard IPC machinery
  • the actor can now accept requests for async functions which can use the new `tractor.to_asyncio.run()` method, allowing tasks to be spawned in the host `asyncio` loop with the result(s) awaited/streamed back into the `trio` task
  • results can then be relayed upward to the parent actor as per normal operation

It ended up being possible to accomplish per-task error propagation by simply doing a little state passing inside the callback passed to `asyncio.Task.add_done_callback()`, thus not requiring the use of `anyio`. In fact, I'm not even sure this is possible with the task groups offered by `anyio`, given the nature of spawning `asyncio` tasks from `trio` tasks while providing an API to acquire the results on demand.
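For illustration, here is a minimal, self-contained sketch of that state-passing trick in plain `asyncio` (the helper names are made up; the real code instead cancels the waiting `trio` task and looks the saved error up afterwards):

```python
import asyncio

async def boom() -> None:
    raise RuntimeError('asyncio task failed')

def main() -> None:
    err: list[BaseException] = []

    def save_error(task: asyncio.Task) -> None:
        # A cancelled task raises from ``.exception()``, so guard it.
        if not task.cancelled() and task.exception() is not None:
            err.append(task.exception())

    async def waiter() -> None:
        task = asyncio.create_task(boom())
        task.add_done_callback(save_error)
        # Wait without letting the task's error propagate here...
        await asyncio.wait([task])
        # ...then re-raise from the saved per-task state on demand.
        if err:
            raise err[0]

    try:
        asyncio.run(waiter())
    except RuntimeError as e:
        print(f'relayed: {e}')

if __name__ == '__main__':
    main()
```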

The main idea here is that a new `tractor` actor can be spawned, and async functions that will run under `trio` can make calls to spawn `asyncio` tasks using the new `tractor.to_asyncio.run()` routine, which acts basically like `tractor.Portal.run()` but without any IPC going on underneath. All IPC is done as normal once results are back in the (embedded) `trio` task.

This basically makes tractor integration with asyncio frameworks a super cinch now :)
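A rough sketch of the intended end-to-end usage; `tractor.to_asyncio.run()` is named in this PR, while the `infect_asyncio` spawn flag is an assumption here (borrowed from the branch name), so treat the exact spawning API as provisional:

```python
import asyncio
import tractor
import trio

async def aio_add(x: int, y: int) -> int:
    # Plain asyncio coroutine, run on the infected actor's host loop.
    await asyncio.sleep(0.1)
    return x + y

async def trio_entry(x: int, y: int) -> int:
    # Runs under trio (guest mode) inside the asyncio actor; hops the
    # work onto the host asyncio loop and waits for the result.
    return await tractor.to_asyncio.run(aio_add, x=x, y=y)

async def main() -> None:
    async with tractor.open_nursery() as n:
        # ``infect_asyncio`` is assumed; the PR only says the actor's
        # process entry point becomes ``asyncio.run()``.
        portal = await n.run_in_actor(
            trio_entry, x=1, y=2, infect_asyncio=True,
        )
        # Results relay back to the parent actor over normal IPC.
        assert await portal.result() == 3

trio.run(main)
```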

ping @njsmith @oremanj.
If either of you have time to look at this and tell me whether there's a more trionic way, it'd be greatly appreciated 😸

Further things to be done:

Provisional test list:

  • call async func
  • [ ] call async generator and stream from it (no longer supported)
  • error from async func
  • error during async streaming
  • all the above in remote actors
  • [ ] @stream compatibility? (unneeded with the new api)
  • [ ] to_trio / from_trio injection (don't remember why I wrote this)
  • simple asyncio task error propagation
  • trio cancels aio on child side
  • cancel via Portal.cancel_actor()
  • cancel via trio.move_on_after() around actor nursery
  • asyncio cancels itself and causes a trio cancellation
  • interloop channel streaming via to_asyncio.open_channel_from():
  • trio error kills interloop channel in SC style
  • trio closes interloop channel early and channel exits
  • asyncio error propagates to trio task with interloop channel
  • small trio-asyncio echo server (see the sketch after this list)
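As a rough illustration of that last group of items, here is a hedged sketch of two-way interloop streaming via `to_asyncio.open_channel_from()`; the echo logic, the parameter names (`to_trio`, `from_trio`), and the `'start'` handshake are illustrative assumptions, not the PR's exact API:

```python
import tractor

async def aio_echo_server(to_trio, from_trio) -> None:
    # asyncio side: ``open_channel_from()`` hands in the channel ends
    # (parameter names here are assumptions). Handshake, then echo.
    to_trio.send_nowait('start')
    while True:
        msg = await from_trio.get()
        to_trio.send_nowait(msg)

async def trio_echo_client() -> None:
    # trio side, assumed to be running inside the asyncio-infected
    # actor (i.e. under guest mode).
    async with tractor.to_asyncio.open_channel_from(
        aio_echo_server,
    ) as (first, chan):
        assert first == 'start'
        for i in range(3):
            await chan.send(i)
            assert await chan.receive() == i
```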

```python
task = asyncio.create_task(_invoke(from_trio, to_trio, coro))
err = None

# XXX: I'm not sure this actually does anything...
```
goodboy (Owner, Author) replied:

Old; it does.
```python
# XXX: I'm not sure this actually does anything...
def cancel_trio(task):
    """Cancel the calling ``trio`` task on error.
```
goodboy (Owner, Author) replied:

and save any error.

goodboy added a commit to pikers/piker that referenced this pull request Jul 4, 2020
Infected `asyncio` support is being added to `tractor` in
goodboy/tractor#121 so delegate to all that new machinery.

Start building out an "actor-aware" api which takes care of all the
`trio`-`asyncio` interaction for data streaming and request handling.
Add a little (shudder) method proxy system which can be used to invoke
client methods from another actor. Start on a streaming api in
preparation for real-time charting.
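The "method proxy" mentioned here might look roughly like the following `__getattr__`-based forwarding shim (entirely illustrative; the actual piker code and `Portal.run()` call signature may differ):

```python
class MethodProxy:
    '''Forward attribute access as remote method invocations
    through a ``tractor.Portal`` (illustrative sketch only).
    '''
    def __init__(self, portal, client_func):
        self._portal = portal
        self._client_func = client_func  # remote func exposing methods

    def __getattr__(self, meth_name: str):
        async def invoke(**kwargs):
            # Ship the method name + kwargs to the remote actor and
            # wait on the result coming back over IPC.
            return await self._portal.run(
                self._client_func, meth=meth_name, **kwargs,
            )
        return invoke
```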
@goodboy force-pushed the infect_asyncio branch 4 times, most recently from cea0b0b to 69dad96 (July 30, 2020)
@goodboy changed the base branch from master to multiproc_debug (July 30, 2020)
@goodboy force-pushed the multiproc_debug branch 2 times, most recently from 62f74b9 to e483042 (August 13, 2020)
goodboy added six more commits to pikers/piker referencing this pull request with the same message (Sep 1 through Sep 29, 2020).
@goodboy force-pushed the infect_asyncio branch 2 times, most recently from 3dd4739 to 36ec668 (October 7, 2020)
Pull the common `asyncio` -> `trio` error translation logic into
a common context manager and don't expect a final result to be captured
when using `open_channel_from()` since it's a manager interface and it
would be clunky to try and deliver some "final result" after exit.
Wraps the pairs of underlying `trio` mem chans and the `asyncio.Queue`
with this new composite which will be delivered from `open_channel_from()`.
This allows for both sending and receiving values from the `asyncio`
task (2-way msg passing) as well as controls for cancelling or waiting
on the task.

Factor `asyncio` translation and re-raising logic into a new closure
which is run on both `trio` side error handling as well as on normal
termination to avoid missing `asyncio` errors even when `trio` task
cancellation is handled first.

Only close the `trio` mem chans on `trio` task termination *iff*
the task was spawned using `open_channel_from()`:
- on `open_channel_from()` exit, mem chan closure is the desired semantic
- on `run_task()` we normally only return a single value or error and
  if the channel is closed before the error is raised we may propagate
  a `trio.EndOfChannel` instead of the desired underlying `asyncio`
  task's error
For whatever reason `trio` seems to be swallowing this exception when
raised in the `trio` task, so instead wrap it in our own non-base
exception type, `AsyncioCancelled`, and raise that when the `asyncio`
task cancels itself internally, using `raise <err> from <src_err>` style.

Further, don't bother cancelling the `trio` task (via cancel scope)
since we can just use the recv mem chan closure error as a signal
and explicitly look up any set `asyncio` error.
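A minimal sketch of the wrapping pattern described above (the `AsyncioCancelled` name comes from this commit; the surrounding relay helper is illustrative):

```python
import asyncio

class AsyncioCancelled(Exception):
    '''Wrap ``asyncio.CancelledError`` (a ``BaseException``) in a
    regular ``Exception`` so the ``trio`` side can't swallow it.
    '''

async def relay_result(coro):
    # If the asyncio task cancels itself internally, re-raise as our
    # own non-base type using ``raise <err> from <src_err>`` style.
    try:
        return await coro
    except asyncio.CancelledError as src_err:
        raise AsyncioCancelled(
            'asyncio task was cancelled'
        ) from src_err
```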
Better encapsulate all the mem-chan, Queue, sync-primitives inside our
linked task channel in order to avoid `mypy`'s complaints about monkey
patching. This also sets footing for adding an `asyncio`-side channel
API that can be used more like this `trio`-side API.
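Such an encapsulating type might look roughly like this (field and type names are assumptions based on the commit message, not the actual implementation):

```python
from dataclasses import dataclass
import asyncio
import trio

@dataclass
class LinkedTaskChannel:
    '''Bundle the inter-loop primitives behind one typed object so
    nothing needs monkey patching (keeping ``mypy`` happy).
    '''
    _to_aio: asyncio.Queue                 # trio -> asyncio
    _from_aio: trio.MemoryReceiveChannel   # asyncio -> trio
    _trio_cs: trio.CancelScope
    _aio_task_complete: trio.Event

    async def send(self, item) -> None:
        # Deliver a value to the asyncio task's queue.
        self._to_aio.put_nowait(item)

    async def receive(self):
        # Wait on the next value relayed from the asyncio task.
        return await self._from_aio.receive()
```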
@goodboy merged commit bbcdbaa into master on Dec 17, 2021
@goodboy deleted the infect_asyncio branch on December 17, 2021 16:02