Can't use client API on OS X #556
Yeah, that thing is causing nothing but problems. It seems this is a waf deficiency. But even without it, OSX wouldn't work too well, because of strange cocoa issues. We haven't figured out yet how to use cocoa with the client API. It needs work.
Any way I can help with Cocoa?
The problem is how to integrate the cocoa video output window (implemented by mpv's cocoa VO code). On win32 and X11, embedding the VO window is pretty simple: the host application has to provide a parent window handle, and mpv will just create the VO window as a sub window. This happens with the `wid` option.

For cocoa, this is not so simple. I don't know too much about cocoa, but the problem starts with the fact that cocoa requires the use of a NSApp singleton. It looks like the host application will usually create this singleton (if that application is a GUI), so mpv obviously can't create it. We probably need to refactor the mpv cocoa code a bit, so that mpv can provide some sort of embeddable cocoa widget. So, basically, we have 2 problems:

Note that smplayer on OSX doesn't embed anything. Instead, the video data is transferred over shared memory, and rendered by smplayer itself. We don't really want that.
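For reference, the win32/X11 path described above can be driven from the client API by setting the parent window handle before initialization. Below is a minimal sketch using Python's ctypes against libmpv; it assumes libmpv is installed, and `embed_mpv` and `wid_option_value` are illustrative names, not part of mpv.

```python
import ctypes
import ctypes.util

def wid_option_value(parent_window_id):
    # mpv's "wid" option takes the parent window id (an X11 Window or a
    # win32 HWND) as an integer; string options are encoded as bytes here.
    return str(int(parent_window_id)).encode()

def embed_mpv(parent_window_id, media_path):
    """Create an mpv instance whose VO window is a child of parent_window_id."""
    libname = ctypes.util.find_library("mpv")
    if libname is None:
        raise RuntimeError("libmpv not found")
    libmpv = ctypes.CDLL(libname)
    libmpv.mpv_create.restype = ctypes.c_void_p
    handle = ctypes.c_void_p(libmpv.mpv_create())
    # "wid" must be set before mpv_initialize() so the VO is created embedded.
    libmpv.mpv_set_option_string(handle, b"wid", wid_option_value(parent_window_id))
    libmpv.mpv_initialize(handle)
    # Start playback; mpv_command takes a NULL-terminated array of strings.
    cmd = (ctypes.c_char_p * 3)(b"loadfile", media_path.encode(), None)
    libmpv.mpv_command(handle, cmd)
    return libmpv, handle
```

On cocoa there is no equivalent of this yet, which is exactly the gap discussed in this thread.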
Ah, embedding into Cocoa views is going to be fun. Neat. I'm going to poke around the .m files for a bit and see what you are doing currently. Somewhat related to (1), I'm actually looking to render the video as a textured quad in an OpenGL scene rather than putting it in a Cocoa window; how could that be reconciled with client API embedding? I see mpv already has GL render drivers, but I'm guessing that exposing them to the client API would be a messy task...?
This would probably also be possible. I think using the same GL context for application and mpv would be too messy, but one could use a shared GL context. Which means mpv creates its own GL context, but so that GL objects are explicitly shared with the GL context of the application. Another possibility would be using IOSurfaces to exchange image data between mpv and application. In both cases, mpv would pass rendered images with timestamps to the client API user. Handling initialization and passing of image data would probably require extra APIs.
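From the host application's side, the frame-passing idea above (mpv renders into shared surfaces and delivers them with timestamps, and the application draws them itself) could look roughly like this. This is a purely illustrative Python sketch; `FrameReceiver`, `on_frame`, and `next_frame` are invented names, not an actual mpv API.

```python
import queue

class FrameReceiver:
    """Host-side consumer for a hypothetical frame-passing API: mpv renders
    into shared surfaces (e.g. IOSurfaces) and delivers (surface id, pts)
    pairs; the application displays them itself, e.g. as a textured quad."""

    def __init__(self):
        self._frames = queue.Queue()

    def on_frame(self, surface_id, pts):
        # Would be called from mpv's render thread.
        self._frames.put((surface_id, pts))

    def next_frame(self, timeout=None):
        # The host's render loop pops the next frame and its display time.
        return self._frames.get(timeout=timeout)
```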
Oh, interesting! IOSurfaces sound like a great way to handle both Cocoa and raw GL cases. It would confine Mac embedders to only use the GL driver though. Hmm.
Currently, all GUI VOs use OpenGL on OSX anyway. X11 still works on OSX, but I doubt anyone actually wants to use this.
@pigoz Neat! Thanks.
We could use some help with this.
Still deadlocking? I can give it a shot. I tried looking at it with Instruments a month or two ago but didn't see anything that stood out. I'll give it a go again and see what I can find. |
Well, the main problem is that cocoa requires the application to create a global NSApplication object, and this object must be created in the main thread (that is, the thread that is created first by the OS, and in which the C main() function runs). But libmpv is supposed to be embedded into other GUIs (no matter whether they're native cocoa, or something like Qt), so it can't assume it owns the main thread. The mpv cocoa parts that are used by vo_opengl also depend on such a NSApplication object. Currently, mpv probably deadlocks because the NSApplication object is explicitly not created when starting mpv with libmpv (mpv_create()), and it waits for a condition that never becomes true, or something like this. This probably can't be easily solved, and will probably require some more complicated interaction between the host application and libmpv.
Oh, and maybe I didn't make this clear: the host application creates the NSApplication object. Now how can libmpv make use of it, and how can it create and embed its own cocoa widget or whatever is needed to create a GL context and to display video?
In the libmpv case, mpv can just obtain the singleton with `NSApp` (i.e. `[NSApplication sharedApplication]`).
I see two general approaches to this (API-wise):
I think 1. would be better for Cocoa (if possible). 2. would be more generic and could be reused on other operating systems, but would also require more code and interaction from the application.
Well, 2. is what I want to use, and I think most people will really want to use it for the sake of being cross-platform. 1. is a bridge that can be crossed when someone wants it. What strategy does the included mpv OS X app bundle take, anyway?
So I'm looking at writing some ObjC for mpv and need some ground rules... what is the oldest version of OS X mpv intends to support? Clearly there's no ARC being used; are there any other guidelines I should be following?

Also, looking at macosx_application_objc.h, there's an mpv-specific NSApplication for the mpv standalone. It has some properties used by the input subsystem/Apple remote/media key code:

```objc
@property(nonatomic, assign) struct input_ctx *inputContext;
@property(nonatomic, retain) EventsResponder *eventsResponder;
// ...
@property(nonatomic, retain) NSCondition *input_ready;
```

Is it cool if I move these out of NSApplication and just make them on-demand singletons or something? In the non-standalone case, they couple things together unnecessarily (things break in my test program because my test app's NSApplication doesn't implement these properties).
No singletons please, it's a terrible pattern :) Can you post your test application somewhere so that I can use it as a test? I think that maybe when using mpv as a library we want to disable most of the initialization of mpv's NSApp stuff, though I'd still like clients to get most of that behavior from libmpv itself. And I agree, at the moment the code is a little bit too coupled, but I am unsure how to modify it because I am not sure what would be the most convenient API for clients.
I agree with what you're saying. NSApp is already a singleton, so those three variables are de facto globals... I think they should just live in a separate file. Is GCD okay to use? What versions of OS X does mpv support? My test app is just DOCS/client_api_examples/simple.c pasted into a Cocoa skeleton right now. It isn't really worth seeing...
Yep, it was ugly but not detrimental until we got the client API. Now that the client API is there they should not be accessible from global constants.
Yes! As a matter of fact, GCD is already used in parts of the cocoa code.
At the moment 10.7+. But that's not strict; now that 10.9 has been out for some time, we can consider bumping the requirement to 10.8+ if we need some newer APIs.
Ping? |
I've been way too busy to get very far... will attempt some more the day after tomorrow.
Can this issue be closed now? Also, @frau, maybe it would make sense to include your cocoa example under DOCS/client_api_examples as well. |
From what I understand, the bare client API works, but showing video with it doesn't.
Or should this remain open until the API is fully functional...? |
Well, the original issue was that you couldn't use the API at all, now you can, so that's solved. Any improvements on that are separate issues IMO. |
When attempting `./waf build`, I get a NotImplemented error. I tracked this down to `do_the_symbol_stuff` in `waftools/syms.py` not supporting the Mach-O binary format, but I'm not sure how to implement that...
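One possible direction for the waf issue above: teach the symbol-extraction step to handle Mach-O by shelling out to `nm`, analogous to how exported symbols are gathered for ELF. This is only a sketch under assumptions, not the actual `waftools/syms.py` code; `parse_nm_output` and `macho_exported_symbols` are hypothetical names.

```python
import subprocess

def parse_nm_output(text):
    """Return exported symbol names from `nm -g`-style output.

    Mach-O symbol names carry a leading underscore, which is stripped so
    the result matches the plain (ELF-style) symbol names."""
    syms = []
    for line in text.splitlines():
        parts = line.split()
        # Defined external symbols have three columns: address, type, name.
        # Undefined symbols ("U") have no address and are skipped.
        if len(parts) == 3 and parts[1] in ("T", "D", "S"):
            name = parts[2]
            syms.append(name[1:] if name.startswith("_") else name)
    return syms

def macho_exported_symbols(binary_path):
    # -g restricts nm's output to external (global) symbols.
    out = subprocess.check_output(["nm", "-g", binary_path])
    return parse_nm_output(out.decode())
```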