[DRY?] Direct definition vs. generic combinators, ctd. in Algebra.Definitions.RawMonoid
#2475
Comments
Ghastly. But such is the fate of the functional representation. So much right-programming, just when I'd gotten used to the beauty of left-programming! I wish there were a better symbol than …
Not quite sure what "Ghastly" is referring to... but: the (implied) criticism of existing choices in …
Second thoughts on the refactoring: cf. the twist on Russell's "the advantages of theft over honest toil", namely that there's always a choice between making a definition (and getting the property 'for free', except that you then have to show equivalence with the 'old'/'standard' one) and sticking with the 'old' definition and proving the property; in this case, one way out of the bottle is simply to prove, in …, the required equivalence (a sketch of what that might look like follows below).
(UPDATED: needed to rethink what I had previously written)
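To make the "honest toil" option concrete, here is a minimal, self-contained sketch: a plain right-fold `sum` over functional vectors, a `sumʳ` variant that avoids the trailing `ε`, and the equivalence proof that transfers properties from one to the other. All names and the parameterisation over `_∙_`/`ε` are illustrative only, not code from the issue or from `stdlib`.

```agda
module EquivalenceSketch where

open import Data.Fin.Base using (Fin; zero; suc)
open import Data.Nat.Base using (ℕ; zero; suc)
open import Relation.Binary.PropositionalEquality
  using (_≡_; refl; sym; cong)

-- parameterised over a bare 'raw monoid', plus the one law the proof needs
module _ {A : Set} (_∙_ : A → A → A) (ε : A)
         (identityʳ : ∀ x → (x ∙ ε) ≡ x) where

  Vector : ℕ → Set
  Vector n = Fin n → A

  -- the 'standard' right fold: always ends in ε
  sum : ∀ {n} → Vector n → A
  sum {zero}  xs = ε
  sum {suc n} xs = xs zero ∙ sum (λ i → xs (suc i))

  -- the variant that avoids the trailing ε on non-empty vectors
  sumʳ : ∀ {n} → Vector n → A
  sumʳ {zero}        xs = ε
  sumʳ {suc zero}    xs = xs zero
  sumʳ {suc (suc n)} xs = xs zero ∙ sumʳ (λ i → xs (suc i))

  -- the 'honest toil': an equivalence that transfers properties of sum
  sumʳ≗sum : ∀ {n} (xs : Vector n) → sumʳ xs ≡ sum xs
  sumʳ≗sum {zero}        xs = refl
  sumʳ≗sum {suc zero}    xs = sym (identityʳ (xs zero))
  sumʳ≗sum {suc (suc n)} xs =
    cong (xs zero ∙_) (sumʳ≗sum (λ i → xs (suc i)))
```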
"Ghastly" refers to having to pattern-match on implicits because the (functional) representation doesn't let you make any kind of split. It is inevitable. Programming with Now that I'm looking back at these: I wonder if they are all too strict (especially the new Again on the name |
I see! (That said, for pattern-matching on a type which doesn't (easily) admit the patterns you want: use views!?) And as for the "anti-pattern", it's perhaps instructive to revisit McCarthy & Painter (1967): 'abstract syntax' is (originally?) defined there in terms of destructors, because that doesn't (necessarily?) commit the implementor to a(n inductive) tree representation... cf. Atanassow, frequently cited previously on this topic.
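For what it's worth, here is one shape the "views" suggestion could take for functional vectors: a datatype of cons-shaped results plus a function computing it, so that client code pattern-matches on the view rather than on the (implicit) length. Purely illustrative; none of this is from the issue or from the library.

```agda
module ViewSketch where

open import Data.Fin.Base using (Fin; zero; suc)
open import Data.Nat.Base using (ℕ; zero; suc)
open import Data.Vec.Functional using (Vector)

private
  variable
    A : Set
    n : ℕ

-- a view: every functional vector is, by its length, either empty or
-- a 'cons' of a head element and a tail vector
data View (A : Set) : ℕ → Set where
  []   : View A zero
  _∷ᵛ_ : ∀ {n} → A → Vector A n → View A (suc n)

view : ∀ {n} → Vector A n → View A n
view {n = zero}  xs = []
view {n = suc n} xs = xs zero ∷ᵛ (λ i → xs (suc i))

-- a client: the head of a non-empty vector, pattern-matching on the view
-- rather than on the (implicit) length or on Fin
head′ : Vector A (suc n) → A
head′ xs with view xs
... | x ∷ᵛ _ = x
```

A more informative view could additionally record that the original vector is pointwise equal to the cons of its head and tail, but that is not needed for the simple client above.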
I'm all for splitting programming into concrete programming (providing useful functionality for a particular representation) and abstract programming (i.e. programming to an interface). What kind of functions that interface gives you (builders or observations or a combination of both) is a different thing.
Addition: suggest adding to `Data.Vec.Functional.Base` (and perhaps also retrofitting to `Data.Vec.Base`, too) a combinator along the lines sketched below.
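One possible shape for such a combinator (the name `foldr⁺`, and indeed everything below, is a guess rather than the issue's actual code): a right fold over `Data.Vec.Functional`'s `Vector` that uses the seed only on the empty vector, so that a monoid-style summation built on it never appends a trailing unit.

```agda
module FoldSketch where

open import Data.Nat.Base using (ℕ; zero; suc)
open import Data.Vec.Functional using (Vector; head; tail)

private
  variable
    A : Set
    n : ℕ

-- a right fold that uses the seed only on the empty vector; a summation
-- built on it therefore never appends a trailing unit element
foldr⁺ : (A → A → A) → A → Vector A n → A
foldr⁺ {n = zero}        _∙_ e xs = e
foldr⁺ {n = suc zero}    _∙_ e xs = head xs
foldr⁺ {n = suc (suc n)} _∙_ e xs = head xs ∙ foldr⁺ _∙_ e (tail xs)
```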
Refactoring: suggest replacing the definitions in `Algebra.Definitions.RawMonoid` with something like the sums sketched below (with the last one for backwards compatibility; but even for right-associative addition, `sumʳ` avoids redundancy in the `{n = 1}` case), and then also `_×_` and `_×′_` redefined on top of them (again: using `sumʳ` in the first definition might be an improvement? although a breaking change...).
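For concreteness, a hedged sketch of what such a replacement might look like: `sumʳ` avoiding the trailing `ε`, the existing fold-based `sum` kept for backwards compatibility, and `_×_`/`_×′_` obtained by summing a constant vector (recall that `replicate` is simply `const`). The parameterisation over `_∙_` and `ε` stands in for the `RawMonoid` bundle, and all the bodies below are guesses rather than the issue's actual code.

```agda
module RefactoringSketch {A : Set} (_∙_ : A → A → A) (ε : A) where

open import Data.Nat.Base using (ℕ; zero; suc)
open import Data.Vec.Functional using (Vector; head; tail; foldr)

private
  variable
    n : ℕ

-- new: right-fold summation that never appends a trailing ε (in the
-- proposal this would presumably be obtained from a generic combinator
-- in Data.Vec.Functional.Base; it is inlined here to stay self-contained)
sumʳ : Vector A n → A
sumʳ {n = zero}        xs = ε
sumʳ {n = suc zero}    xs = head xs
sumʳ {n = suc (suc n)} xs = head xs ∙ sumʳ (tail xs)

-- the existing fold-based definition, kept (last) for backwards compatibility
sum : Vector A n → A
sum = foldr _∙_ ε

-- and then multiplication by a natural number, by summing a constant
-- vector (Vector.replicate is simply const, i.e. λ _ → x)
_×_ : ℕ → A → A
n × x = sum {n = n} (λ _ → x)

_×′_ : ℕ → A → A
n ×′ x = sumʳ {n = n} (λ _ → x)
```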
BUT as to performance/intensional reduction behaviour: defining `_×_` and `_×′_` in this way, even with `{-# INLINE ... #-}` directives, just doesn't quite give the correct unfolding behaviour for the existing proofs in `Algebra.Properties.Monoid.Mult.*` to go through properly; (`stdlib` seems to be quite sparing in its appeals to such 'extralogical' techniques).
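For reference, here is a self-contained sketch of the kind of lemma that lives in `Algebra.Properties.Monoid.Mult`: with the direct definition, `suc m × x` unfolds by computation to `x ∙ (m × x)`, and the proof below silently relies on exactly that. (The lemma name is illustrative, and propositional equality is used here for brevity; the library works over the monoid's setoid equality.)

```agda
module UnfoldingSketch where

open import Data.Nat.Base using (ℕ; zero; suc; _+_)
open import Relation.Binary.PropositionalEquality
  using (_≡_; sym; trans; cong)

-- a bare monoid, with just the two laws this proof needs
module _ {A : Set} (_∙_ : A → A → A) (ε : A)
         (identityˡ : ∀ x → (ε ∙ x) ≡ x)
         (assoc : ∀ x y z → ((x ∙ y) ∙ z) ≡ (x ∙ (y ∙ z))) where

  -- the direct definition: suc n × x unfolds by computation to x ∙ (n × x)
  _×_ : ℕ → A → A
  zero  × x = ε
  suc n × x = x ∙ (n × x)

  -- a typical Mult-style lemma; the steps that 'just compute' below are
  -- exactly the definitional unfoldings that the issue reports losing
  -- once _×_ is redefined via the generic combinators
  ×-homo-+ : ∀ (x : A) (m n : ℕ) → ((m + n) × x) ≡ ((m × x) ∙ (n × x))
  ×-homo-+ x zero    n = sym (identityˡ (n × x))
  ×-homo-+ x (suc m) n =
    trans (cong (x ∙_) (×-homo-+ x m n))
          (sym (assoc x (m × x) (n × x)))
```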
It seems as though the 'generic' definitions ought to be more robust/reusable/'better' (esp. as `Vector.replicate` is simply `const`, so the use of `head` and `tail` in the various folds is inlinable/eliminable?), but it just... doesn't quite seem to work :-(

Any help/insight into how to (understand how to) resolve this trade-off is welcome! It feels as though I am missing something fundamental, but possibly non-obvious, about what's going on here.