Here's my code:

let padded = "03";
ascii = `\u00${padded}`;

However, I receive "Bad character escape sequence" from Babel. I'm trying to end up with \u0003 in the ascii variable. What am I doing wrong?

EDIT: Ended up with ascii = (eval('"\\u00' + padded + '"'));
Comments:

- Uh? The eval solution yields the same result as String.fromCodePoint... – Felix Kling, Nov 23, 2015 at 15:58
- Don't provide your own answer in the question. If you think you have the answer, then post it as an answer. – user663031, Nov 23, 2015 at 16:03
1 Answer

What am I doing wrong?
A unicode escape sequence is basically atomic; you cannot build one dynamically. Template literals essentially perform string concatenation, so your code is equivalent to

'\u00' + padded

It should be obvious now why you get that error: '\u00' on its own is an incomplete escape sequence. If you want the corresponding unicode character, use String.fromCodePoint or String.fromCharCode instead:

String.fromCodePoint(3)

If you want a string that literally contains the character sequence \u0003, then you just need to escape the escape character to produce a literal backslash:

`\\u00${padded}`
Title: javascript - ES6: Bad character escape sequence creating ASCII string - Stack Overflow
Tags: javascript