[Standards] XEP-231 Bits of Binary
dave at cridland.net
Sat May 31 19:20:24 UTC 2014
On 31 May 2014 19:28, Christian Schudt <christian.schudt at gmx.de> wrote:
> I am currently reading and implementing XEP-231 Bits of Binary and
> stumbled upon a possible bug:
> I noticed, that the SHA-1 hash "8f35fef110ffc5df08d579a50083ff9308fb6242",
> which is used in the examples,
> is the hash of the used Base64 string ("iVBORw0KGgoAAA…..").
> The description of the "cid" attribute says:
> the "hash" is the hex output of the algorithm applied to the binary data
> Do we need to calculate the hash from the Base64 String or from the
> underlying binary data?
If the examples disagree with the text, the text is the one to follow.
It's possible, though, that the deployed base has chosen to follow the
examples instead of the text, so it's well worth asking here and on jdev@;
however, that's not what has happened in this case.
In this case, the cid scheme URI's path is designated a SHOULD, not a MUST.
This is because, in practice, a common URI generation algorithm only matters
for caching purposes - that is, it only matters if two independent entities
use the same cid for different BoBs. Using two different cids for the same
BoB is less efficient, but won't cause an interoperability problem (hence
the SHOULD rather than MUST).
> i.e. the examples have used something like (in Java code):
> sha1 ( base64String.getBytes() )
> and the description suggests something like:
> sha1 ( Base64.decode(base64String) ) or just directly using sha1(
> byteArray )
So the latter is correct. But you cannot rely on it, and shouldn't be
testing for it in a URI from someone else.
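To make the "latter is correct" reading concrete, here is a minimal Java sketch of hashing the decoded binary data rather than the Base64 string. The class and method names (`BobCid`, `cidHash`) are illustrative, not from the thread or the XEP:

```java
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;
import java.util.Base64;

public class BobCid {
    // Hex-encode bytes as lowercase, matching the style of the XEP examples.
    static String hex(byte[] bytes) {
        StringBuilder sb = new StringBuilder();
        for (byte b : bytes) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }

    // Per the text of XEP-0231: apply the hash algorithm to the *decoded*
    // binary data, not to the Base64 string carried in the <data/> element.
    static String cidHash(String base64Data) {
        try {
            byte[] binary = Base64.getDecoder().decode(base64Data);
            return hex(MessageDigest.getInstance("SHA-1").digest(binary));
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException("SHA-1 unavailable", e);
        }
    }

    public static void main(String[] args) {
        // "aGVsbG8=" is Base64 for "hello"; the hash produced here is
        // SHA-1("hello"), not SHA-1("aGVsbG8=").
        String cid = "sha1+" + cidHash("aGVsbG8=") + "@bob.xmpp.org";
        System.out.println(cid);
    }
}
```

And per the point above about the SHOULD: generate your own cids this way, but don't reject or second-guess a cid received from someone else just because it doesn't match this computation.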