
First draft IVar-Future-Promise-Probe unification #176

Closed
wants to merge 12 commits into ruby-concurrency:master from pitr-ch:next

Conversation

@pitr-ch (Member) commented Oct 23, 2014

Not ready for merge; opening for discussion. The compatibility layer is entirely missing for now.

@chrisseaton (Member) commented:

Does this link to some discussion? What's the background and motivation for this?

@pitr-ch (Member, Author) commented Oct 23, 2014

It relates to #139; I've just added a comment there: #139 (comment).

@iNecas (Contributor) commented Oct 23, 2014

Love it!

@jdantonio (Member) commented:

At first glance there seem to be some great ideas here. Unfortunately I only have my iPad with me and won't be able to pull the code until I return home. I apologize for not being able to take a more thorough look right now.

@pitr-ch (Member, Author) commented Nov 4, 2014

Another set of improvements:

  • untangle callbacks into methods for better readability
  • a promise tree starting with a delay can now also be triggered by calling
    wait or value on any of its children
  • better inspectability
  • track what blocks what
  • better to_s and inspect support

@coveralls commented:

Coverage Status: Coverage remained the same when pulling 16783f3 on pitr-ch:next into 9dc888d on ruby-concurrency:master.

@pitr-ch (Member, Author) commented Nov 4, 2014

The latest commit allows chaining futures into an acyclic graph (branches are created with Future#then and similar methods, and joined with Future.join), where any node can be made lazily evaluated. Examples:

# if the head of the tree is constructed with #delay instead of #future, it does not start
# executing; evaluation is triggered later by calling wait or value on any of the dependent
# futures or on the delay itself
three = (head = delay { 1 }).then { |v| v.succ }.then(&:succ)
four  = three.then_delay(&:succ)

# evaluates only up to three, four is left unevaluated
p three.value # 3
p four.complete? # false
# until #value is called on four
p four.value # 4

# futures hidden behind two delays trigger evaluation of both
double_delay = delay { 1 }.then_delay(&:succ)
p double_delay.value # 2
head    = future { 1 }
branch1 = head.then(&:succ).then(&:succ)
branch2 = head.then(&:succ).then_delay(&:succ)
result  = Future.join(branch1, branch2).then { |b1, b2| b1 + b2 }

sleep 0.1
p branch1.completed?, branch2.completed? # true, false
# force evaluation of whole graph
p result.value # 6

@pitr-ch (Member, Author) commented Nov 5, 2014

There are usage examples at the end of the file; I forgot to mention that. Thanks @iNecas.

@pitr-ch (Member, Author) commented Nov 7, 2014

It'll also work nicely with actors.

actor.ask(:msg).then { |result| p result } # no blocking
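
A hypothetical aside: the counter and logger actors and the messages below are made up for illustration; only #ask returning a future comes from this PR. Since each reply is a future, it composes with the combinators shown earlier, such as #then and Future.join, without blocking the caller.

# counter and logger are hypothetical actors; #ask returns a future
count_f   = counter.ask(:count)
flushed_f = logger.ask(:flush)

# fires once both replies have arrived; the calling thread never blocks
Future.join(count_f, flushed_f).then { |count, flushed| p count, flushed }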

@coveralls commented:

Coverage Status: Coverage decreased (-3.73%) when pulling 8af772c on pitr-ch:next into 9dc888d on ruby-concurrency:master.

require 'jruby'

# roughly more than 2x faster
class JavaSynchronizedObject
Review comment (Member) on the diff hunk above:

👍

I really like this new synchronized object. Being able to wait on a mutex is a great feature, and I always like it when we can optimize on JRuby.
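
For context, here is a rough plain-Ruby sketch of the interface such a synchronized object provides; this is illustrative only, not the class from this diff. The JRuby variant would back the same interface with the JVM's built-in object monitor, which is presumably where the roughly 2x speedup mentioned in the diff comes from.

require 'thread'

# Illustrative sketch only -- not the implementation in this PR.
# An object that can be locked, waited on, and signalled, built from the
# Ruby stdlib; a JRuby subclass could map these onto the Java monitor
# (synchronized / wait / notifyAll).
class RubySynchronizedObject
  def initialize
    @mutex     = Mutex.new
    @condition = ConditionVariable.new
  end

  def synchronize(&block)
    @mutex.synchronize(&block)
  end

  # must be called while holding the lock via #synchronize
  def wait(timeout = nil)
    @condition.wait(@mutex, timeout)
  end

  # wake up all threads currently blocked in #wait
  def broadcast
    @condition.broadcast
  end
end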

@brasten commented Nov 11, 2014

Has there been any discussion around chaining futures where one of the futures' values might itself be a future? Contrived example:

future { future { 1 } }.then { |v| v + 3 }.value

Where Future#then is analogous to Scala's Future.map, something like a Future#flat_then could be analogous to Scala's Future.flatMap.

Of course, duplicating the various chaining methods with #flat_* versions seems somewhat messy. Perhaps a Future#flatten, which would operate kind of like Future.join but would (conceptually) join together a future and its value (if the value ends up resolving to another future).

# Example
future { future { 1 } }.flatten.then { |v| v + 1 }.value

Thoughts?

@obrok (Contributor) commented Nov 12, 2014

I'm all for flat_map on Futures; without it, it's just awkward when you want to combine two things that are futures. What other flat_* methods would you need?

@pitr-ch (Member, Author) commented Nov 12, 2014

@brasten Good point, flatMap is missing. I like future { future { 1 } }.flatten.then { |v| v + 1 }.value. The obvious way to do it now is future { future { 1 } }.value.then { |v| v + 1 }.value, but that is blocking, so it's no good; I'll find a non-blocking solution with callbacks. How to deal with cases like future { raise }.flatten still needs to be figured out; flat_* methods could help there.
BTW: I've found https://github.com/brasten/rama. If you have any concurrent-ruby questions I'll be happy to help on GitHub issues or on https://gitter.im/ruby-concurrency/concurrent-ruby.

@obrok I think @brasten meant: if map is then, then flat_map is flat_then, so flat_rescue and flat_chain would also be missing. I think Scala's flatMap is actually equal to flat_chain in the implementation considered here.

Hm, so to sum up and keep the API consistent, I'll either add flat_* methods or explore whether future { future { 1 } }.flat.then { |v| v + 1 }.value would be possible, doing the same for delay to get things like future { future { 1 } }.flat.delay.then { |v| v + 1 }.value.
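
To make the callback idea concrete, here is a rough self-contained sketch. Nothing below is the draft's real API; TinyFuture and its methods are invented stand-ins, and error propagation for cases like future { raise } is deliberately omitted. The point is only that when the outer future completes with a value that is itself a future, the inner result is forwarded through a callback instead of blocking on #value.

require 'thread'

# invented stand-in with just enough behaviour to show the flattening idea
class TinyFuture
  def initialize
    @mutex, @callbacks, @done = Mutex.new, [], false
  end

  # settle the future with a value and fire pending callbacks exactly once
  def complete(value)
    pending = @mutex.synchronize do
      if @done
        []
      else
        @done, @value = true, value
        @callbacks.dup
      end
    end
    pending.each { |cb| cb.call(value) }
    self
  end

  # run the block when settled; runs immediately if already settled, never blocks
  def on_completion(&block)
    call_now = @mutex.synchronize do
      if @done
        true
      else
        @callbacks << block
        false
      end
    end
    block.call(@value) if call_now
    self
  end
end

# non-blocking flattening: forward the inner future's result when it arrives
def flatten(outer)
  flattened = TinyFuture.new
  outer.on_completion do |value|
    if value.is_a?(TinyFuture)
      value.on_completion { |inner_value| flattened.complete(inner_value) }
    else
      flattened.complete(value)
    end
  end
  flattened
end

For example, flatten(TinyFuture.new.complete(TinyFuture.new.complete(1))).on_completion { |v| p v } prints 1 without any thread ever waiting.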

@obrok (Contributor) commented Nov 12, 2014

Hm, I'm not sure #flatten is what one wants. In my experience it's not usually the case that you have a Future<Future<T>> and want to transform it into a Future<T>. Instead it's usually that you have a future : Future<A> and a function f(a: A) : Future<B> and want to combine them into a Future<B>. Of course you can do it with future.then { |v| f(v) }.flatten, but I prefer future.flat_map { |v| f(v) }.

That said, they can maybe coexist?
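
To make that shape concrete, a small sketch (hypothetical: neither #flat_map nor #flatten exists on this branch yet; the bare future helper is the one used in the earlier examples):

f     = ->(a) { future { a + 1 } }   # a function A -> Future[B]
outer = future { 1 }                 # a Future[A]

outer.then { |a| f.(a) }.flatten.value   # => 2, via then + flatten
outer.flat_map { |a| f.(a) }.value       # => 2, the same thing in one step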

@obrok (Contributor) commented Nov 12, 2014

IIRC bangs aren't supposed to signify destructive but alternative, so then! should be OK.

On Nov 12, 2014 6:48 PM, "Brasten Sager" [email protected] wrote:

@obrok https://github.com/obrok -- Fair enough point; ending up with a
Future<Future> sometimes just means you used map when you should have
used flatMap.

We might also consider bang methods instead of flat_chain / flat_map
methods. Since we don't overload collection concepts for this like Scala,
"flat_" (and "flat_map" especially) may not be the best fit. Maybe
future.then! { |v| f(v) } and the like. It requires some philosophical
gymnastics around what constitutes a "destructive method" of course.

I have no strong opinion either way -- implementer's prerogative. :-)



@pitr-ch (Member, Author) commented Nov 12, 2014

@obrok Ah, I see, thanks. I've looked at the Scala Future API; the names there are driven a lot by Scala's for-comprehensions and the way they unroll (desugar) into flatMap/map calls. I'll try to push something later.
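
For readers who don't know Scala: a for-comprehension such as for { a <- fa; b <- fb } yield a + b desugars into fa.flatMap(a => fb.map(b => a + b)). In the vocabulary discussed above, with the still-hypothetical flat_map name, the equivalent Ruby chain would be:

fa = future { 1 }
fb = future { 2 }

# flat_map plays the role of Scala's flatMap; then plays the role of map
sum = fa.flat_map { |a| fb.then { |b| a + b } }
sum.value # => 3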

Commits added to the branch:

- untangle callbacks into methods for better readability
- a promise tree starting with a delay can now also be triggered by calling
  wait or value on any of its children
- better inspectability
- track what blocks what
- better to_s and inspect support
- add #chain_delay, #then_delay, #rescue_delay, which are the same as #chain,
  #then, #rescue but are only evaluated when requested by #value rather than
  automatically
- restructure the class hierarchy: only one Future with multiple Promise
  implementations, which are hidden from the user; provides better encapsulation
- Delay is now implemented as a Promise descendant
- add Scala's flatMap
- support all, any, flattening
@pitr-ch (Member, Author) commented Dec 5, 2014

Hi, flat support is in.

f = ConcurrentNext.future { ConcurrentNext.future { 1 } }.flat.then(&:succ)
expect(f.value).to eq 2

which works analogously to schedule and delay:

ConcurrentNext.
    future { sleep 0.1; 1 }.
    schedule(0.1).
    then { |v| v + 1 }.
    value # returns 2 in 0.2 sec

f = ConcurrentNext.
    future { 1 }.
    then(&:succ). # evaluates up to here
    delay.
    then(&:succ) # this part is lazily evaluated when first requested by #value
f.value # returns 3

.all and .any are also there:

f1 = ConcurrentNext.future {1}
f2 = ConcurrentNext.future {2}
ConcurrentNext.all(f1, f2).then { |v1, v2| v1+v2 }.value # => 3
ConcurrentNext.any(f1, f2).value # => 1 or 2, depending on which one finishes first

@jdantonio (Member) commented:

I've suggested in #139 that we close this PR and release 1.0 with the existing APIs. Please continue the discussion there.

@pitr-ch added the "enhancement" label (Adding features, adding tests, improving documentation.) on Mar 23, 2015
@pitr-ch added this to the 0.9.0 Release milestone on Apr 9, 2015
@pitr-ch (Member, Author) commented Apr 22, 2015

Merged into #274.
