FFI API Issue on Windows - Strings and Unicode? #916
Comments
Hi @dcvz, nice to meet you!
Isn't `WChar` what you're looking for?
Nice to meet you @dcharkes! Great stuff here in ffi! I have also tried `WChar`, to no avail, although I don't have a branch to play with for that. Should I share one? Any tips on debugging what ffi does internally, so I can check the pointer contents?
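Not from the thread, but as one way to poke at this: a minimal sketch that dumps the raw bytes behind a converted pointer, so you can see exactly what `toNativeUtf8()`/`toNativeUtf16()` wrote into native memory. The string and lengths are arbitrary placeholders.

```dart
import 'dart:ffi';
import 'package:ffi/ffi.dart';

/// Print the first [length] bytes behind [ptr] as hex.
void dumpBytes(Pointer<Uint8> ptr, int length) {
  final bytes = ptr.asTypedList(length);
  print(bytes.map((b) => b.toRadixString(16).padLeft(2, '0')).join(' '));
}

void main() {
  const s = 'Test.mpq'; // arbitrary example string

  final utf8Ptr = s.toNativeUtf8(); // 1 byte per ASCII char + NUL terminator
  dumpBytes(utf8Ptr.cast(), s.length + 1);
  malloc.free(utf8Ptr);

  final utf16Ptr = s.toNativeUtf16(); // 2 bytes per UTF-16 code unit + NUL
  dumpBytes(utf16Ptr.cast(), (s.length + 1) * 2);
  malloc.free(utf16Ptr);
}
```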
I have not completely figured things out, but using `Utf16` gets me further than before. It seems there's some difference between `Utf16` and `WChar`. Also, I only need this type on Windows. Are there compile-time defines, so I can set, for example, the `TCHAR` typedef to one thing on Windows and another on everything else?
https://api.dart.dev/stable/2.16.0/dart-ffi/AbiSpecificInteger-class.html
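Dart has no C-style compile-time defines, but that docs link is the mechanism for exactly this case: an `AbiSpecificInteger` maps one Dart type to different native integers per ABI. Below is a minimal sketch of a hypothetical `TChar` type, 16-bit on the Windows ABIs and 32-bit elsewhere, written in the same style as the `WChar` definition quoted in the next reply. The mapping is illustrative only; every ABI you actually target needs an entry.

```dart
import 'dart:ffi';

/// Hypothetical TCHAR-like type: 2 bytes on Windows, 4 bytes elsewhere.
/// This mapping is a sketch, not complete.
@AbiSpecificIntegerMapping({
  Abi.windowsArm64: Uint16(),
  Abi.windowsIA32: Uint16(),
  Abi.windowsX64: Uint16(),
  Abi.linuxArm64: Uint32(),
  Abi.linuxX64: Uint32(),
  Abi.macosArm64: Uint32(),
  Abi.macosX64: Uint32(),
})
class TChar extends AbiSpecificInteger {
  const TChar();
}
```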
Yes, look at the definitions for both types:

/// The C `wchar_t` type.
///
/// The signedness of `wchar_t` is undefined in C. Here, it is exposed as the
/// defaults on the tested [Abi]s.
///
/// The [WChar] type is a native type, and should not be constructed in
/// Dart code.
/// It occurs only in native type signatures and as annotation on [Struct] and
/// [Union] fields.
@Since('2.17')
@AbiSpecificIntegerMapping({
  Abi.androidArm: Uint32(),
  Abi.androidArm64: Uint32(),
  Abi.androidIA32: Uint32(),
  Abi.androidX64: Uint32(),
  Abi.fuchsiaArm64: Uint32(),
  Abi.fuchsiaX64: Int32(),
  Abi.iosArm: Int32(),
  Abi.iosArm64: Int32(),
  Abi.iosX64: Int32(),
  Abi.linuxArm: Uint32(),
  Abi.linuxArm64: Uint32(),
  Abi.linuxIA32: Int32(),
  Abi.linuxX64: Int32(),
  Abi.linuxRiscv32: Int32(),
  Abi.linuxRiscv64: Int32(),
  Abi.macosArm64: Int32(),
  Abi.macosX64: Int32(),
  Abi.windowsArm64: Uint16(),
  Abi.windowsIA32: Uint16(),
  Abi.windowsX64: Uint16(),
})
class WChar extends AbiSpecificInteger {
  const WChar();
}

WChar is defined as an integer with differing sizes on different platforms, while Utf16 is defined as an opaque type (it extends `Opaque`).
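To make the distinction concrete, here is a small sketch (assuming package:ffi's `malloc`/`toNativeUtf16` helpers) of how the same UTF-16 buffer can be viewed either as `Pointer<Utf16>` or, on Windows where `wchar_t` is 16-bit, as `Pointer<WChar>`:

```dart
import 'dart:ffi';
import 'dart:io' show Platform;

import 'package:ffi/ffi.dart';

void main() {
  // toNativeUtf16 allocates a NUL-terminated array of UTF-16 code units and
  // returns it as Pointer<Utf16> (an opaque marker type from package:ffi).
  final Pointer<Utf16> wide = 'Test.mpq'.toNativeUtf16();

  // WChar is an ABI-specific integer: 2 bytes on the Windows ABIs, 4 bytes
  // on most others, so this memory is only a valid wchar_t* on Windows.
  print('sizeOf<WChar>() = ${sizeOf<WChar>()}');
  if (Platform.isWindows) {
    final Pointer<WChar> asWchar = wide.cast();
    print('first wchar_t: ${asWchar.value}'); // code unit for 'T' (0x54)
  }

  malloc.free(wide);
}
```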
Hello team,
Excuse the confusing title, but this is an issue that I'm not quite sure where it stems from; I only have guesses. I'm working on migrating flutter_storm, a bridge for StormLib, from using native bindings to FFI. However, I'm running into an issue (only on Windows).
If you try the example project in flutter_storm and try to open a test archive (Test.mpq.zip), it'll fail on Windows. After some research I found a similar issue on their issue tracker: ladislav-zezula/StormLib#260, which ended up being an issue between using `LPStr` and `LPWStr`, and which led me to explore how FFI is doing string conversions. `ffigen` generated uses of `Pointer<ffi.Char>` for me -- I've also tried using `Pointer<Utf8>` but cannot get it to succeed on Windows. I unfortunately also cannot figure out how to debug ffi conversions and internals. Would love some advice or help here. From Dart I'm converting to native with `toNativeUtf8().cast<Char>()`.

Here's the branch of flutter_storm where I'm doing the migration to ffi: https://github.com/HarbourMasters64/flutter_storm/tree/feature/ffi

Here's the branch where I've attempted to use `Pointer<Utf8>` to no avail: https://github.com/HarbourMasters64/flutter_storm/tree/feature/ffi-utf8