The flag was dropped as part of the Demand Analyser rework.
Signed-off-by: Sergei Trofimovich <email@example.com>
I'm not sure, see this thread: https://mail.haskell.org/pipermail/ghc-devs/2015-October/010067.html
The -fmax-worker-args flag is supposed to stop GHC worker/wrappering a function with a zillion arguments. E.g. suppose we have f :: (Int,Int,Int, ..., Int, Int) -> Int which, say, adds up all the fields of a 200-tuple. It would probably gain little to worker/wrapper this, because the worker would get 200 arguments. Maybe that's ok, but the gain over allocating the argument tuple in the heap is proportionately less. But clearly some refactoring lost this ability. Maybe someone should put it back!
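To illustrate what's at stake (a sketch only; the names fWrapper/fWorker are mine, and the real transform happens on GHC Core, not source Haskell): worker/wrapper conceptually splits a function like this into a wrapper that unpacks the tuple and a worker that takes the fields as separate arguments.

```haskell
-- Original: one strict tuple argument.
f :: (Int, Int, Int) -> Int
f (a, b, c) = a + b + c

-- After worker/wrapper (conceptual source-level rendering):
-- the wrapper unpacks the tuple and calls the worker, which
-- receives each field as a separate (unboxable) argument.
fWrapper :: (Int, Int, Int) -> Int
fWrapper (a, b, c) = fWorker a b c

fWorker :: Int -> Int -> Int -> Int
fWorker a b c = a + b + c
```

For a 200-tuple the worker would take 200 arguments, which is the kind of blow-up -fmax-worker-args was meant to cap.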
Christiaan says (https://mail.haskell.org/pipermail/ghc-devs/2015-October/010096.html):
> I'll also see about reviving -fmax-worker-args.
> Although it's not used by any package on Hackage, and it has been dead for over 2 years now I think.
I (sadly) don't see myself working on reinstating -fmax-worker-args. However, given that it was unintentionally rendered a no-op during improvement of the demand analysis phase, I would think removing it entirely is a mistake. Instead, I think it would be better if GHC issued a warning that -fmax-worker-args is currently a no-op.
I'd rather we not have options that say "this was broken and we have no intention of fixing it now", if possible. So I'd suggest we either:

A) remove it and file a bug to revive it (maybe with high priority), or
B) actually fix it, or
C) don't really say "this flag was broken and doesn't work", but just kind of lie and say "this flag has no effect" and file a ticket.

I suppose C) is the easiest path (but of course, probably nobody will ever fix it, which is my worry!)
Either way, this seems undecided, so I'm punting on this currently.