
Here's my code:

let padded = "03";
ascii = `\u00${padded}`;

However, I receive "Bad character escape sequence" from Babel. I'm trying to end up with:

\u0003

in the ascii variable. What am I doing wrong?

EDIT:

Ended up with ascii = (eval('"\\u00' + padded + '"'));


  • Uh? The eval solution yields the same result as String.fromCodePoint... – Felix Kling Commented Nov 23, 2015 at 15:58
  • Don't provide your own answer in the question. If you think you have the answer, then post it as an answer. – user663031 Commented Nov 23, 2015 at 16:03

1 Answer


What am I doing wrong?

A Unicode escape sequence is atomic; you cannot build one up dynamically. A template literal essentially performs string concatenation, so your code is equivalent to

'\u00' + padded

It should be obvious now why you get that error: '\u00' on its own is not a valid escape sequence. If you want to get the corresponding Unicode character, you can instead use String.fromCodePoint or String.fromCharCode:

String.fromCodePoint(3)
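
If the digits are only known at runtime, a minimal sketch (assuming, as in the question, that padded always holds two hex digits) is to parse them into a number first and pass that in:

// not from the original answer: parse the hex digits, then convert
let padded = "03";
let ascii = String.fromCodePoint(parseInt(padded, 16)); // same character as "\u0003"
console.log(ascii.charCodeAt(0)); // 3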

If you want a string that literally contains the character sequence \u0003, then you just need to escape the escape character to produce a literal backslash:

`\\u00${padded}`
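
As a quick check (not part of the original answer), the escaped version yields the six-character text \u0003 rather than the control character itself:

let padded = "03";
let literal = `\\u00${padded}`;
console.log(literal);        // \u0003
console.log(literal.length); // 6 (a backslash, a "u" and four digits)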
