
c# - Using SendInput to send Unicode characters beyond U+FFFF

I'm writing an onscreen keyboard similar to the one in Windows 8. I have no problem sending most of the characters I need using Win32's SendInput.

The problem comes with the new Windows 8 emoji, which start at U+1F600 and are rendered by the Segoe UI Symbol font.

Using Spy++ on the Windows 8 onscreen keyboard, I get the following output for every emoji glyph:

<00001> 000C064A P WM_KEYDOWN nVirtKey:VK_PACKET cRepeat:1 ScanCode:00 fExtended:0 fAltDown:0 fRepeat:0 fUp:0
<00002> 000C064A P WM_CHAR chCharCode:'63' (63) cRepeat:1 ScanCode:00 fExtended:0 fAltDown:0 fRepeat:0 fUp:0
<00003> 000C064A P WM_KEYUP nVirtKey:VK_PACKET cRepeat:1 ScanCode:00 fExtended:0 fAltDown:0 fRepeat:1 fUp:1
<00004> 000C064A P WM_KEYDOWN nVirtKey:VK_PACKET cRepeat:1 ScanCode:00 fExtended:0 fAltDown:0 fRepeat:0 fUp:0
<00005> 000C064A P WM_CHAR chCharCode:'63' (63) cRepeat:1 ScanCode:00 fExtended:0 fAltDown:0 fRepeat:0 fUp:0
<00006> 000C064A P WM_KEYUP nVirtKey:VK_PACKET cRepeat:1 ScanCode:00 fExtended:0 fAltDown:0 fRepeat:1 fUp:1

Since every glyph produces the same output (character code 63 is simply '?'), I can't see what is sent that actually identifies the unique glyph.

I'm aware that SendInput supports a KEYEVENTF_UNICODE flag for sending Unicode characters. But these characters live in the supplementary planes, beyond the 16-bit range (U+0000 to U+FFFF) that a C# char or the 16-bit wScan field of the INPUT struct can represent.
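
To see the limitation concretely, Char.ConvertFromUtf32 expands such a code point into two UTF-16 code units (a quick illustration, not part of the original code):

    string s = Char.ConvertFromUtf32(0x1F600);            // "😀"
    Console.WriteLine(s.Length);                          // 2: a surrogate pair, not a single char
    Console.WriteLine($"{(int)s[0]:X4} {(int)s[1]:X4}");  // D83D DE00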

Here is my SendCharUnicode method.

public static void SendCharUnicode(char ch)
{
    Win32.INPUT[] input = new Win32.INPUT[2];

    // Key-down event: the UTF-16 code unit goes in wScan, and the
    // virtual-key code is 0 because KEYEVENTF_UNICODE is set.
    input[0] = new Win32.INPUT();
    input[0].type = Win32.INPUT_KEYBOARD;
    input[0].ki.wVk = 0;
    input[0].ki.wScan = (short)ch;
    input[0].ki.time = 0;
    input[0].ki.dwFlags = Win32.KEYEVENTF_UNICODE;
    input[0].ki.dwExtraInfo = Win32.GetMessageExtraInfo();

    // Matching key-up event for the same code unit.
    input[1] = new Win32.INPUT();
    input[1].type = Win32.INPUT_KEYBOARD;
    input[1].ki.wVk = 0;
    input[1].ki.wScan = (short)ch;
    input[1].ki.time = 0;
    input[1].ki.dwFlags = Win32.KEYEVENTF_UNICODE | Win32.KEYEVENTF_KEYUP;
    input[1].ki.dwExtraInfo = Win32.GetMessageExtraInfo();

    Win32.SendInput(2, input, Marshal.SizeOf(typeof(Win32.INPUT)));
}

How can I modify this method to successfully send a Unicode character such as 😀 (U+1F600)?


1 Answer


I have used API Monitor on the Windows 8 onscreen keyboard, and it does indeed use SendInput. After further investigation I discovered that you need to break the UTF-32 code point down into its UTF-16 surrogate pair, e.g. 😄 (U+1F604) becomes [U+D83D, U+DE04]. So if I send 0xD83D followed by 0xDE04, I can successfully send U+1F604.
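
The decomposition is just standard UTF-16 arithmetic; sketched by hand for U+1F604:

    int v = 0x1F604 - 0x10000;                 // offset into the supplementary planes: 0xF604
    char high = (char)(0xD800 + (v >> 10));    // 0xD83D, the high (lead) surrogate
    char low  = (char)(0xDC00 + (v & 0x3FF));  // 0xDE04, the low (trail) surrogate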

Here is a working method:

public static void SendCharUnicode(int utf32)
{
    // Convert the UTF-32 code point to UTF-16: one char inside the
    // BMP, a two-char surrogate pair beyond U+FFFF.
    string unicodeString = Char.ConvertFromUtf32(utf32);
    Win32.INPUT[] input = new Win32.INPUT[unicodeString.Length];

    for (int i = 0; i < input.Length; i++)
    {
        // One KEYEVENTF_UNICODE key-down event per UTF-16 code unit.
        input[i] = new Win32.INPUT();
        input[i].type = Win32.INPUT_KEYBOARD;
        input[i].ki.wVk = 0;
        input[i].ki.wScan = (short)unicodeString[i];
        input[i].ki.time = 0;
        input[i].ki.dwFlags = Win32.KEYEVENTF_UNICODE;
        input[i].ki.dwExtraInfo = IntPtr.Zero;
    }

    Win32.SendInput((uint)input.Length, input, Marshal.SizeOf(typeof(Win32.INPUT)));
}
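
The Win32 wrapper class is not shown in the post; a minimal set of interop declarations that these snippets would compile against might look like the following sketch. This is an assumption, not the poster's actual code: wVk and wScan are declared as short to match the casts above, and two padding fields keep Marshal.SizeOf(typeof(INPUT)) equal to the size of the native INPUT union, whose largest member (MOUSEINPUT) is 8 bytes bigger than KEYBDINPUT.

    using System;
    using System.Runtime.InteropServices;

    public static class Win32
    {
        public const uint INPUT_KEYBOARD = 1;
        public const uint KEYEVENTF_KEYUP = 0x0002;
        public const uint KEYEVENTF_UNICODE = 0x0004;

        [StructLayout(LayoutKind.Sequential)]
        public struct KEYBDINPUT
        {
            public short wVk;          // virtual-key code; 0 when KEYEVENTF_UNICODE is used
            public short wScan;        // UTF-16 code unit when KEYEVENTF_UNICODE is set
            public uint dwFlags;
            public uint time;
            public IntPtr dwExtraInfo;
        }

        [StructLayout(LayoutKind.Sequential)]
        public struct INPUT
        {
            public uint type;          // INPUT_KEYBOARD
            public KEYBDINPUT ki;
            // Padding so Marshal.SizeOf matches the native INPUT union
            // (its largest member, MOUSEINPUT, is 8 bytes bigger).
            private uint padding1;
            private uint padding2;
        }

        [DllImport("user32.dll", SetLastError = true)]
        public static extern uint SendInput(uint nInputs, INPUT[] pInputs, int cbSize);

        [DllImport("user32.dll")]
        public static extern IntPtr GetMessageExtraInfo();
    }

With those declarations in place, SendCharUnicode(0x1F600) types 😀 into whichever window has keyboard focus.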
