As a result of the encoding changes in Darcs 2.14, which made
encoding behaviour much better on Linux, there have been some
regressions in encoding behaviour on Windows (which probably didn't
work that well before anyway). See patch1628 for some comments.
This is different from issue2590, which describes encoding problems on
Windows that are not a regression.
Looking at your patch1683, it occurred to me that the Windows
implementation of decode probably fails unless the input ByteString is
in a valid Unicode encoding, which excludes 8-bit encodings such as the
various ISO Latin variants.
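To illustrate the failure mode I have in mind (this is a stand-in sketch using GHC's strict utf8 encoding, not the actual Windows decode in Darcs): a ByteString holding ISO Latin-1 text is not valid UTF-8, so a strict decoder throws instead of returning a String.

```haskell
import Control.Exception (SomeException, try)
import qualified Data.ByteString as B
import qualified Data.ByteString.Unsafe as BU
import GHC.Foreign (peekCStringLen)
import GHC.IO.Encoding (utf8)

main :: IO ()
main = do
  -- 0xE9 is "é" in ISO Latin-1, but a lone 0xE9 byte is invalid UTF-8,
  -- so a strict decode throws instead of producing a String
  let latin1Bytes = B.pack [0x65, 0xE9]  -- "eé" in Latin-1
  r <- try (BU.unsafeUseAsCStringLen latin1Bytes (peekCStringLen utf8))
         :: IO (Either SomeException String)
  case r of
    Left _  -> putStrLn "decode failed (invalid UTF-8)"
    Right s -> putStrLn ("decoded: " ++ s)
```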
What exactly is the build failure you fixed with patch1683?
I see (now) that GHC.IO.Encoding warns about portability issues. In
particular, it tells us that the //ROUNDTRIP encoding fails to work if
the input ByteString is in UTF-16, which seems to be the case on
(modern) Windows, judging by the code in src/Darcs/Util/Encoding/Win32.hs.
So the code there is incorrect if the user reconfigures the system to
use an 8-bit encoding. On the other hand, the code based on
GHC.IO.Encoding should work in this case. So it would be interesting to
know why it doesn't compile.
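For reference, a minimal sketch of what a GHC.IO.Encoding-based decode could look like (the helper name decodeRoundtrip is mine, and hard-coding "UTF-8" is an assumption; real code would derive the name from the locale encoding). The //ROUNDTRIP suffix makes undecodable bytes survive as lone surrogates in the 0xDC80-0xDCFF range instead of raising an exception, but as the GHC docs warn, that mechanism assumes an ASCII-superset encoding, which UTF-16 is not.

```haskell
import qualified Data.ByteString as B
import qualified Data.ByteString.Unsafe as BU
import GHC.Foreign (peekCStringLen)
import GHC.IO.Encoding (mkTextEncoding)

-- Hypothetical helper: decode a ByteString with the //ROUNDTRIP
-- extension so bytes invalid in the encoding are escaped rather
-- than fatal. "UTF-8" is an assumption for this sketch.
decodeRoundtrip :: B.ByteString -> IO String
decodeRoundtrip bs = do
  enc <- mkTextEncoding "UTF-8//ROUNDTRIP"
  BU.unsafeUseAsCStringLen bs (peekCStringLen enc)

main :: IO ()
main = do
  -- 'e' followed by a byte that is not valid UTF-8
  s <- decodeRoundtrip (B.pack [0x65, 0xE9])
  -- the invalid 0xE9 byte comes back as a lone surrogate, not an error
  print (map fromEnum s)
```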
Somewhat belatedly, I'm attaching the build failure I get if I roll
back patch1683. (Surprisingly, it's still undepended-on, except by tags.)
I think the code it was fixing was just broken: with WIN32 defined,
encode/decode were imported only qualified, and encodeUtf8/decodeUtf8
weren't defined at all, yet all four names were exported.
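The shape of that error can be reproduced with a toy module (all names here are hypothetical stand-ins for the real Darcs modules): exporting a name that is only in scope qualified is rejected by GHC with a "Not in scope" error, and the fix is either an unqualified import or a local wrapper, e.g.:

```haskell
module Encoding (encode) where

-- Stand-in for the qualified-only Win32 import in the real code
import qualified Data.ByteString.Char8 as BC

-- Without this wrapper definition, the export list above fails to
-- compile with "Not in scope: 'encode'" -- the shape of the build
-- failure described here, since only BC.pack is in scope.
encode :: String -> BC.ByteString
encode = BC.pack
```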