I think this is a usability bug in darcs apply:
Consider the case where we have a repository called “central”, and a
local copy called “laggard”. If somebody sends a patch against
“central”, the person applying said patch to “laggard” will sometimes be
confused when darcs refuses to apply the bundle because of missing
dependencies. The confusion comes from the violation of the expectation
set by darcs pull and friends that unwanted non-dependencies can be
cherry-picked. How come apply doesn't do cherry-picking?
The answer is not patch-theory related. It's just an artefact of how
darcs send/apply works: a bundle records the exact context it was made
in, not the actual dependencies of the patches it carries.
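To make the failure mode concrete, here is a toy model in plain Python. It is not darcs internals and every name in it is invented; it only illustrates the idea that a bundle names its whole context and apply refuses when any named patch is absent, dependency or not:

```python
# Toy model of darcs send/apply, NOT the real implementation.
# A "bundle" carries a patch plus the exact context (names of all
# patches present when the bundle was made), with no record of
# which context entries are genuine dependencies.

def make_bundle(patch, repo_patches):
    """Like 'darcs send': the context is everything in the source repo."""
    return {"patch": patch, "context": list(repo_patches)}

def apply_bundle(bundle, repo_patches):
    """Like 'darcs apply': refuse unless the whole context is present."""
    missing = [p for p in bundle["context"] if p not in repo_patches]
    if missing:
        raise ValueError(f"missing patches from context: {missing}")
    return repo_patches + [bundle["patch"]]

# "central" has patches A and B; a contributor sends C against it.
central = ["A", "B"]
bundle_c = make_bundle("C", central)

# "laggard" never pulled B.  Even if C only really depends on A,
# apply fails because B is named in the bundle's context.
laggard = ["A"]
try:
    apply_bundle(bundle_c, laggard)
except ValueError as e:
    print(e)   # missing patches from context: ['B']
```

With darcs pull the same user could have cherry-picked C and skipped B; here the context check forbids it.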
I think we should look at some ways to improve on this, for example:
* minimal context
* store metadata in bundles identifying the original repository, and add
UI like “fetch the missing patches from http://example.com?”
I encountered this same problem two nights ago after implementing a
rudimentary rebase command in terms of obliterate and apply. I had
patches ABCD and called 'obliterate -o' to create a patch file for each
one. I then tried to apply them in the order ACDB. Of course, the bundle
for D referred to B in the context and 'apply' refused to proceed.
Before this experience, I falsely assumed that context for a bundle was
calculated by commuting patches backwards until any part of them changed.
Is that what kowey means by minimal context?
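If that reading is right, “minimal context” could be sketched roughly as below. This is a toy Python sketch with invented names, and it cheats in two ways: real darcs commutation works on patch contents rather than file names, and this version only drops a trailing run of commuting patches rather than handling interleavings:

```python
# Toy sketch of "minimal context": commute the bundled patch
# backwards past trailing context patches it does not depend on.
# A patch here is a name plus the set of files it touches, and two
# patches "commute" iff their file sets are disjoint -- a crude
# stand-in for real patch commutation.

def commutes(p, q):
    return not (p["files"] & q["files"])

def minimal_context(patch, context):
    """Drop trailing context patches the new patch commutes past."""
    ctx = list(context)
    while ctx and commutes(patch, ctx[-1]):
        ctx.pop()
    return [p["name"] for p in ctx]

A = {"name": "A", "files": {"core.hs"}}
B = {"name": "B", "files": {"docs.txt"}}
C = {"name": "C", "files": {"ui.hs"}}
D = {"name": "D", "files": {"core.hs"}}

# D touches the same file as A but shares nothing with B or C, so
# it commutes back past them and its minimal context is just [A].
print(minimal_context(D, [A, B, C]))   # ['A']
```

Under a scheme like this, the ACDB replay above would work: D's bundle would name only A, which the repository already has.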
In this particular case, storing extra metadata and fetching false
dependencies during apply would break the intent of rebase.