Javascript vs Java - String encoding

Keywords: string, encoding, Java vs Javascript      Updated: 2023-09-26

I am trying to port the following Java code to JavaScript:

String key1 = "whatever";
String otherKey = "blah";
String key2;    
byte keyBytes[] = key1.getBytes();
for (int i = 0; i < keyBytes.length; i++) {
    keyBytes[i] ^= otherKey.charAt(i % otherKey.length());
}
key2 = new String(keyBytes);

Here is what I wrote:

var key1 = "whatever";
var other_key = "blah";
var key2 = "";
for (var i = 0; i < key1.length; ++i)
{
    var ch = key1.charCodeAt(i);
    ch ^= other_key.charAt(i % other_key.length);
    key2 += String.fromCharCode(ch);
}

However, the two versions produce different results.

What is going wrong? Are JavaScript strings encoded differently, and how can I make the two match?

You forgot a charCodeAt() call in your code. It should look like this:

var key1 = "whatever";
var other_key = "blah";
var key2 = "";
for (var i = 0; i < key1.length; ++i)
{
    var ch = key1.charCodeAt(i);
    ch ^= other_key.charAt(i % other_key.length).charCodeAt(0);
    key2 += String.fromCharCode(ch);
}
In Java, there is an implicit conversion before the ^= is applied: the char returned by charAt() is promoted to int, XOR-ed with the byte, and the result is narrowed back to byte. JavaScript does not convert a string to its character code for you, so you have to call charCodeAt() explicitly.
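On the JavaScript side, the reason the buggy version silently did nothing can be seen directly in the console: XOR-ing a number with a one-character string coerces the string with ToNumber, which yields NaN for a letter, and NaN becomes 0 under ToInt32, so the XOR is a no-op.

```javascript
// Minimal demonstration of the coercion that hid the bug.
const ch = "w".charCodeAt(0);          // 119

// "b" -> ToNumber -> NaN -> ToInt32 -> 0, so the XOR changes nothing:
console.log(ch ^ "b");                 // 119

// With the explicit char code (98), the XOR actually happens:
console.log(ch ^ "b".charCodeAt(0));   // 21
```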

I changed the code to print the byte arrays in both Java and JavaScript. After running, the results are identical:

Javascript:

function convert(){
    var key1 = "whatever";
    var other_key = "blah";
    var key2 = "";
    var byteArray = [];
    for (var i = 0; i < key1.length; ++i){
       var ch = key1.charCodeAt(i);
       ch ^= other_key.charAt(i % other_key.length).charCodeAt(0);
       byteArray.push(ch);
       key2 += String.fromCharCode(ch);
    }
    alert(byteArray);
}

Result: 21,4,0,28,7,26,4,26


Java:

static void convert() {
    String key1 = "whatever";
    String otherKey = "blah";
    String key2;
    byte keyBytes[] = key1.getBytes();
    for (int i = 0; i < keyBytes.length; i++) {
        keyBytes[i] ^= otherKey.charAt(i % otherKey.length());
    }
    System.out.println(Arrays.toString(keyBytes));
    key2 = new String(keyBytes);
}

Result: [21, 4, 0, 28, 7, 26, 4, 26]
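For completeness, the same port can also be sketched with TextEncoder and a Uint8Array, which mirrors Java's byte[] more closely. The helper name xorKey is made up for this sketch, and it assumes ASCII input, where the UTF-8 bytes from TextEncoder match what getBytes() returns under common default charsets:

```javascript
// Sketch of the port using a real byte array instead of string math.
function xorKey(key1, otherKey) {
    // Uint8Array of UTF-8 bytes; for ASCII this equals Java's getBytes()
    const bytes = new TextEncoder().encode(key1);
    for (let i = 0; i < bytes.length; i++) {
        bytes[i] ^= otherKey.charCodeAt(i % otherKey.length);
    }
    return bytes;
}

console.log(Array.from(xorKey("whatever", "blah")));
// [ 21, 4, 0, 28, 7, 26, 4, 26 ]
```

Writes into a Uint8Array are wrapped to 8 bits automatically, so this also reproduces Java's narrowing back to byte if the key ever contains non-ASCII characters.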