How to output emoji to console in Node.js (on Windows)?

On Windows, there's some basic emoji support in the console, so I can get a monochrome glyph if I type, e.g., ☕ or 📜. I can output a string from PowerShell, a C# console application, or Python, and they all display those characters fine.

However, from Node.js, I can only get a couple of emoji to display (e.g. ☕), but not others (instead of 📜 I see �). Yet if I throw a string containing those characters, they display correctly.

console.log(' πŸ“œ β˜• ');
throw ' πŸ“œ β˜• ';

If I run the above script, the output is

 οΏ½ β˜•

C:\Code\emojitest\emojitest.js:2
throw ' πŸ“œ β˜• '; 
^
 πŸ“œ β˜•

Is there any way that I can output those emoji correctly without throwing an error? Or is that exception output happening outside of what's available to me through the standard Node.js APIs?

asked May 18 '17 by bdukes



1 Answer

What you want may not be possible without a change to libuv. When you (or the console) write to stdout or stderr on Windows and the stream is a TTY, libuv does its own conversion from UTF‑8 to UTF‑16. In doing so it explicitly refuses to output surrogate pairs, emitting instead the replacement character U+FFFD οΏ½ for any codepoint beyond the BMP.

Here’s the culprit in uv/src/win/tty.c:

  /* We wouldn't mind emitting utf-16 surrogate pairs. Too bad, the */
  /* windows console doesn't really support UTF-16, so just emit the */
  /* replacement character. */
  if (utf8_codepoint > 0xffff) {
    utf8_codepoint = UNICODE_REPLACEMENT_CHARACTER;
  }
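
To see from the JavaScript side which characters trip that check, here is a minimal sketch (astralCodepoints is a hypothetical helper name, not a Node API) that flags every codepoint above U+FFFF, i.e. everything outside the Basic Multilingual Plane:

  // Mirrors libuv's check above: codepoints greater than 0xffff lie outside
  // the Basic Multilingual Plane and are the ones replaced with U+FFFD on a
  // Windows TTY. astralCodepoints is a hypothetical name for illustration.
  function astralCodepoints(str) {
    const found = [];
    for (const ch of str) {                // for...of iterates by codepoint
      const cp = ch.codePointAt(0);
      if (cp > 0xffff) found.push('U+' + cp.toString(16).toUpperCase());
    }
    return found;
  }

  console.log(astralCodepoints('📜 ☕')); // [ 'U+1F4DC' ]; ☕ is U+2615, inside the BMP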

The throw message appears correctly because Node lets Windows do the conversion from UTF‑8 to UTF‑16 with MultiByteToWideChar() (which does emit surrogate pairs) before writing the message to the console. (See PrintErrorString() in src/node.cc.)

Note: A pull request has been submitted to resolve this issue.
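
Until that fix ships, one possible mitigation from JavaScript is to keep astral-plane characters out of the lossy TTY path yourself. A hedged sketch (safeLog is a hypothetical helper, not part of Node's API):

  // Workaround sketch, not the libuv fix: on a Windows TTY, substitute each
  // astral-plane character with its \u{...} escape so no stray U+FFFD appears.
  function safeLog(str) {
    if (process.platform === 'win32' && process.stdout.isTTY) {
      str = str.replace(/[\u{10000}-\u{10FFFF}]/gu, ch =>
        '\\u{' + ch.codePointAt(0).toString(16).toUpperCase() + '}');
    }
    console.log(str);
  }

  safeLog(' 📜 ☕ '); // Windows TTY prints: \u{1F4DC} ☕ ; elsewhere: 📜 ☕

This degrades gracefully: on platforms whose consoles render astral codepoints natively, the string passes through unchanged.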

answered Sep 18 '22 by Brian Nixon