
How do I encode Unicode character codes in a PowerShell string literal?

How can I encode the Unicode character U+0048 (H), say, in a PowerShell string?

In C# I would just do this: "\u0048", but that doesn't appear to work in PowerShell.

asked Jun 29 '09 by dan-gph

People also ask

Can PowerShell display Unicode?

Windows supports both Unicode and traditional character sets. Traditional character sets, such as Windows code pages, use 8-bit values, or combinations of 8-bit values, to represent the characters used in a specific language or geographic region. PowerShell uses Unicode by default.

How do I encode a Unicode?

Unicode uses two encoding forms, 8-bit and 16-bit, depending on the type of data being encoded. The default encoding form is 16-bit, where each character is 16 bits (2 bytes) wide. The 16-bit form is usually written as U+hhhh, where hhhh is the hexadecimal code point of the character.
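To see those 16-bit code units concretely, you can ask .NET's UTF-16LE encoder for the raw bytes of a character; this is a quick illustration, not part of the original Q&A:

```powershell
# 'H' is U+0048; its UTF-16LE encoding is the two bytes 72, 0
PS > [System.Text.Encoding]::Unicode.GetBytes('H')
72
0
```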

What are Unicode literals?

If the character string literal has a prefix of N, the literal is treated as a Unicode string. When the N prefix is used, the characters in the literal are read as WCHAR characters. Any string literal with non-ASCII characters is treated as a Unicode literal by default.


1 Answer

Replace '\u' with '0x' and cast it to System.Char:

```powershell
PS > [char]0x0048
H
```
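The cast also works in the other direction, which is handy when you want to find a character's code point rather than produce the character:

```powershell
PS > [int][char]'H'                # numeric code point of 'H'
72
PS > '{0:X4}' -f [int][char]'H'    # formatted as four hex digits, as in U+0048
0048
```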

You can also use the "$()" syntax to embed a Unicode character into a string:

```powershell
PS > "Acme$([char]0x2122) Company"
Acme™ Company
```

Where ™ (U+2122) is the trademark symbol, used for unregistered trademarks.
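As a supplementary note beyond the original answer: PowerShell 6 and later also support a `` `u{...} `` escape sequence inside double-quoted strings, which is the closest analogue to C#'s `\u` syntax. Windows PowerShell 5.1 does not support this escape:

```powershell
# Requires PowerShell (Core) 6+
PS > "Acme`u{2122} Company"
Acme™ Company
```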

answered Sep 19 '22 by Shay Levy