CGEventKeyboardSetUnicodeString cannot be called; incorrect metadata?

Issue #64 resolved
Andrew Barnert
created an issue

The way Quartz.CGEventKeyboardSetUnicodeString is defined, it's effectively impossible to call.

There are various other functions that take UniChar* parameters that may be affected; I haven't tested them.

In PyObjC 2.4.x, you had to explicitly cast your string to a sequence of ints from 0-65535 (and of course encode to UTF-16, directly or via NSString/CFString), but then it worked. For example:

s16 = Foundation.NSString.stringWithString_(s)
Quartz.CGEventKeyboardSetUnicodeString(event, len(s16), map(ord, s16))

In 2.5.1, you still can't pass bytes or unicode, and you also can't pass a sequence of ints, a ctypes pointer/array/buffer of char or wchar_t, a sequence of bytes or unicode objects, or anything else I can think of. (Actually, one of the ctypes combinations seemed to work in Python 2.7, but not in Python 3.3.)

I believe the problem was losing the _C_IN modifier in the metadata: the type is now ^T instead of n^S. If I edit the metadata to add the following, it works with the same workarounds as in 2.4:

{'arguments': {2: {'type_modifier': 'n'}}}
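With the input modifier restored, the third argument is an input buffer of UTF-16 code units. For what it's worth, the per-code-unit ints can be produced in pure Python, without going through NSString; a sketch (the helper name is mine, not part of PyObjC):

```python
def utf16_units(s):
    # Encode to UTF-16 big-endian (no BOM) and split into 16-bit code units,
    # each an int in the range 0-65535. Astral characters become surrogate pairs.
    data = s.encode("utf-16-be")
    return [int.from_bytes(data[i:i + 2], "big") for i in range(0, len(data), 2)]

print(utf16_units("Hi"))          # [72, 105]
print(utf16_units("\U0001F600"))  # surrogate pair [0xD83D, 0xDE00]
```

Presumably you'd then call `Quartz.CGEventKeyboardSetUnicodeString(event, len(units), units)` with the resulting list, the same shape as the 2.4 workaround above.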
