In an app we are calculating an HMAC-SHA1 in Java using the following:
SecretKey key = new SecretKeySpec(secret, "HmacSHA1");
Mac m = Mac.getInstance("HmacSHA1");
m.init(key);
byte[] hmac = m.doFinal(data);
And later, the HMAC is verified in C#, on a smart card, using:
HMACSHA1 hmacSha = new HMACSHA1(secret);
hmacSha.Initialize();
byte[] hmac = hmacSha.ComputeHash(data);
However, the result is not the same. Did I overlook something important?
The inputs seem to be the same. Here are some sample inputs:
Data: 546573746461746131323341fa3c35
Key: 6d795472616e73616374696f6e536563726574
Result Java: 37dbde318b5e88acbd846775e38b08fe4d15dac6
Result C#: dd626b0be6ae78b09352a0e39f4d0e30bb3f8eb9
I wouldn't mind implementing my own HMAC-SHA1 on both platforms, but I'd rather use what already exists.
Thanks!
With a complete Java test program along the following lines:
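(A minimal sketch: the HmacTest class and the hexToBytes/bytesToHex helpers are my own scaffolding; the Mac calls are exactly the ones from the question, applied to the hex-decoded sample data and key above.)

import javax.crypto.Mac;
import javax.crypto.SecretKey;
import javax.crypto.spec.SecretKeySpec;

public class HmacTest {

    // Decode a hex string such as "fa3c35" into raw bytes.
    static byte[] hexToBytes(String hex) {
        byte[] out = new byte[hex.length() / 2];
        for (int i = 0; i < out.length; i++) {
            out[i] = (byte) Integer.parseInt(hex.substring(2 * i, 2 * i + 2), 16);
        }
        return out;
    }

    // Encode raw bytes as a lowercase hex string.
    static String bytesToHex(byte[] bytes) {
        StringBuilder sb = new StringBuilder();
        for (byte b : bytes) {
            sb.append(String.format("%02x", b));
        }
        return sb.toString();
    }

    public static void main(String[] args) throws Exception {
        byte[] data = hexToBytes("546573746461746131323341fa3c35");
        byte[] secret = hexToBytes("6d795472616e73616374696f6e536563726574");

        // The exact calls from the question.
        SecretKey key = new SecretKeySpec(secret, "HmacSHA1");
        Mac m = Mac.getInstance("HmacSHA1");
        m.init(key);
        byte[] hmac = m.doFinal(data);

        System.out.println(bytesToHex(hmac));
    }
}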
I get

dd626b0be6ae78b09352a0e39f4d0e30bb3f8eb9

which is what you have from your C# implementation. I have also verified that value against my own HMAC and SHA-1 implementation (in Java), and I get the same result. It seems that your Java code is flawed, but not in the part you show (except for your m.init(secret), which does not compile; it has to be m.init(key)). As my code shows, the Java implementation of HMAC/SHA-1 is correct and you invoke it properly (I am using Sun's JDK 1.6.0_16). My guess is that you are not inputting the right data or key.
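To illustrate that guess (this is only an assumption on my part, not something the question confirms): the key bytes above are the ASCII encoding of "myTransactionSecret", so it is easy to accidentally MAC the hex spelling of the data or key instead of the decoded bytes. A Java sketch of that mistake:

import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;

public class HmacPitfall {
    public static void main(String[] args) throws Exception {
        String dataHex = "546573746461746131323341fa3c35";
        String keyHex = "6d795472616e73616374696f6e536563726574";

        // WRONG: this MACs the 30 ASCII characters of the hex string,
        // not the 15 bytes they represent, so it cannot match a
        // platform that decodes the hex to raw bytes first.
        Mac m = Mac.getInstance("HmacSHA1");
        m.init(new SecretKeySpec(keyHex.getBytes("US-ASCII"), "HmacSHA1"));
        byte[] wrongHmac = m.doFinal(dataHex.getBytes("US-ASCII"));
    }
}

If one platform decodes the hex strings to raw bytes and the other feeds the strings in directly, both HMAC implementations are correct and the results still disagree.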