Can't use client API on OS X #556

Closed
frau opened this issue Feb 16, 2014 · 27 comments

frau commented Feb 16, 2014

When attempting ./waf build, I get a NotImplemented error. I tracked this down to do_the_symbol_stuff in waftools/syms.py not supporting the Mach-O binary format, but I'm not sure how to implement that...

ghost commented Feb 16, 2014

Yeah, that thing is causing nothing but problems. It seems this is a waf deficiency.

But even without it, OSX wouldn't work too well, because of strange cocoa issues. We haven't figured out yet how to use cocoa with the client API. It needs work.

ghost added the osx label Feb 16, 2014

frau commented Feb 16, 2014

Any way I can help with Cocoa?

ghost commented Feb 16, 2014

The problem is how to integrate the cocoa video output window (implemented by video/out/cocoa_common.m and some other files) with the application that uses the client API.

On win32 and X11, embedding the VO window is pretty simple: the host application has to provide a parent window handle, and mpv will just create the VO window as a sub-window. This happens with the --wid option.
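
For illustration, a minimal sketch of that kind of embedding through the client API, assuming the host hands over an X11 Window (or a win32 HWND cast to an integer); the handle and file name here are placeholders:

#include <mpv/client.h>
#include <stdint.h>

/* Hedged sketch: ask mpv to create its VO window as a child of the host
 * window by setting the "wid" option before initialization. */
static mpv_handle *embed_mpv(int64_t parent_window, const char *file)
{
    mpv_handle *mpv = mpv_create();
    if (!mpv)
        return NULL;

    mpv_set_option(mpv, "wid", MPV_FORMAT_INT64, &parent_window);

    if (mpv_initialize(mpv) < 0) {
        mpv_terminate_destroy(mpv);
        return NULL;
    }

    const char *cmd[] = {"loadfile", file, NULL};
    mpv_command(mpv, cmd);
    return mpv;
}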

For cocoa, this is not so simple. I don't know too much about cocoa, but the problem starts with the fact that cocoa requires the use of a NSApp singleton. It looks like the host application will usually create this singleton (if that application is a GUI), so mpv obviously can't create it. We probably need to refactor the mpv cocoa code a bit, so that mpv can provide some sort of embeddable cocoa widget.

So, basically, we have 2 problems:

  1. Figuring out a way to embed a mpv cocoa window in the host.
  2. Changing the mpv cocoa code to implement it.

Note that smplayer on OSX doesn't embed anything. Instead, the video data is transferred over shared memory, and rendered by smplayer itself. We don't really want that.

frau commented Feb 16, 2014

Ah, embedding into Cocoa views is going to be fun. Neat. I'm going to poke around the .m files for a bit and see what you are doing currently.

Somewhat related to (1), I'm actually looking to render the video as a textured quad in an OpenGL scene rather than putting it in a Cocoa window; how could that be reconciled with client API embedding? I see mpv already has GL render drivers, but I'm guessing that exposing them to the client API would be a messy task...?

ghost commented Feb 16, 2014

I'm actually looking to render the video as a textured quad in an OpenGL scene rather than putting it in a Cocoa window; how could that be reconciled with client API embedding?

This would probably also be possible. I think using the same GL context for application and mpv would be too messy, but one could use a shared GL context: mpv would create its own GL context, set up so that its GL objects are explicitly shared with the application's GL context. Another possibility would be using IOSurfaces to exchange image data between mpv and the application. In both cases, mpv would pass rendered images with timestamps to the client API user. Handling initialization and passing of image data would probably require extra APIs.
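
A minimal sketch of the shared-context variant in Cocoa terms (not mpv code; it assumes the host passes its own NSOpenGLContext, and the names are made up):

#import <Cocoa/Cocoa.h>

// Hedged sketch: a second context created with shareContext: shares textures
// and buffer objects with the host's context, so frames rendered by one can
// be drawn by the other. Manual retain/release style, since mpv doesn't use ARC.
NSOpenGLContext *make_shared_context(NSOpenGLContext *host_context)
{
    NSOpenGLPixelFormatAttribute attrs[] = {
        NSOpenGLPFADoubleBuffer,
        NSOpenGLPFAOpenGLProfile, NSOpenGLProfileVersion3_2Core,
        0
    };
    NSOpenGLPixelFormat *fmt =
        [[[NSOpenGLPixelFormat alloc] initWithAttributes:attrs] autorelease];
    return [[NSOpenGLContext alloc] initWithFormat:fmt shareContext:host_context];
}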

frau commented Feb 16, 2014

Oh, interesting! IOSurfaces sound like a great way to handle both Cocoa and raw GL cases. It would confine Mac embedders to only use the GL driver though. Hmm. CGLTexImageIOSurface2D looks like just the thing for getting mpv video into a scene, and CGLIOSurface.h has nice docs for it.
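
To make that concrete, a sketch of the binding step (assuming a BGRA IOSurface handed over by the producer; sizes and formats are placeholders):

#include <OpenGL/OpenGL.h>
#include <OpenGL/gl.h>
#include <OpenGL/CGLIOSurface.h>
#include <IOSurface/IOSurface.h>

/* Hedged sketch: bind an IOSurface to a rectangle texture so the frame can
 * be drawn as a textured quad in the host's GL scene. */
GLuint texture_from_iosurface(CGLContextObj cgl, IOSurfaceRef surface)
{
    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_RECTANGLE_ARB, tex);
    CGLTexImageIOSurface2D(cgl, GL_TEXTURE_RECTANGLE_ARB, GL_RGBA,
                           (GLsizei)IOSurfaceGetWidth(surface),
                           (GLsizei)IOSurfaceGetHeight(surface),
                           GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV,
                           surface, 0);
    glBindTexture(GL_TEXTURE_RECTANGLE_ARB, 0);
    return tex;
}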

ghost commented Feb 16, 2014

Currently, all GUI VOs use OpenGL on OSX anyway. X11 still works on OSX, but I doubt anyone actually wants to use this.

pigoz commented Feb 19, 2014

@frau: now with 08170c6 it builds. Unfortunately it seems to deadlock in the example program. Will have a closer look at what is wrong.

frau commented Mar 7, 2014

@pigoz Neat! Thanks.

ghost mentioned this issue Mar 11, 2014

ghost commented Apr 8, 2014

We could use some help with this.

frau commented Apr 8, 2014

Still deadlocking? I can give it a shot.

I tried looking at it with Instruments a month or two ago but didn't see anything that stood out. I'll give it a go again and see what I can find.

ghost commented Apr 8, 2014

Still deadlocking?

Well, the main problem is that cocoa requires the application to create a global NSApplication object, and this object must be created in the main thread (that is, the thread that is created first by the OS, and in which the C main() function runs). Since libmpv is supposed to be embedded into other GUIs (no matter whether they're native cocoa, or something like Qt), it can't create that object itself. The mpv cocoa parts that are used by vo_opengl also depend on such an NSApplication object.

Currently, mpv probably deadlocks because the NSApplication object is explicitly not created when starting mpv with libmpv (mpv_create()), and it waits for a condition that never becomes true, or something like this.

This probably can't be easily solved, and will probably require some more complicated interaction between the host application and libmpv.

ghost commented Apr 8, 2014

Currently, mpv probably deadlocks because the NSApplication object is explicitly not created when starting mpv with libmpv (mpv_create())

Oh, and maybe I didn't make this clear: the host application creates the NSApplication object. Now how can libmpv make use of it, and how can it create and embed its own cocoa widget or whatever is needed to create a GL context and to display video?

frau commented Apr 8, 2014

In the libmpv case, MPV can just obtain the singleton with [NSApplication sharedApplication] and all should be well... Embedding is another story. I will attempt a simple GL context implementation.
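
A minimal sketch of that, with an added hop to the main thread since AppKit objects must only be touched there (the function name is made up, and it assumes the host's main run loop is running):

#import <Cocoa/Cocoa.h>
#include <dispatch/dispatch.h>

// Hedged sketch: obtain (or create) the NSApplication singleton, jumping to
// the main queue when called from one of libmpv's own threads.
static void ensure_nsapp(void)
{
    if ([NSThread isMainThread]) {
        [NSApplication sharedApplication];
    } else {
        dispatch_sync(dispatch_get_main_queue(), ^{
            [NSApplication sharedApplication];
        });
    }
}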

ghost commented Apr 8, 2014

I see two general approaches to this (API-wise):

  1. mpv provides some kind of cocoa widget, that cocoa applications can embed.
  2. A special VO backend that requires the application to provide a bunch of callbacks. These callbacks would allow mpv to set up an OpenGL context etc., and would basically be independent of any GUI toolkit.

I think 1. would be better for Cocoa (if possible). 2. would be more generic and could be reused on other operating systems, but would also require more code and interaction from the application (a rough sketch of what such callbacks might look like is below).
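
To make option 2 concrete, a purely hypothetical sketch of what such a callback set could look like; none of these names exist in libmpv:

// Hypothetical only -- not an actual libmpv API. The idea is that mpv renders
// into a GL context it does not own, driven entirely through host callbacks.
struct host_gl_callbacks {
    void *userdata;
    void (*make_current)(void *userdata);     // make the host's GL context current
    void (*release_current)(void *userdata);  // release it again
    void (*swap_buffers)(void *userdata);     // present the rendered frame
    void *(*get_proc_address)(void *userdata, const char *name); // resolve GL symbols
};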

frau commented Apr 9, 2014

Well, 2. is what I want to use, and I think it's what most people will really want to use for the sake of being cross-platform. 1. is a bridge that can be crossed when someone wants it. What strategy does the included mpv OS X app bundle take, anyway?

frau commented Apr 9, 2014

So I'm looking at writing some ObjC for mpv and need some ground rules... what is the oldest version of OS X mpv intends to support? Clearly there's no ARC being used; are there any other guidelines I should be following?

And looking at macosx_application_objc.h, there's an mpv-specific NSApplication subclass for the mpv standalone. It has some properties used by the input subsystem / Apple remote / media key code:

@property(nonatomic, assign) struct input_ctx *inputContext;   // mpv's input context (input subsystem)
@property(nonatomic, retain) EventsResponder *eventsResponder; // Apple remote / media key handling
// ...
@property(nonatomic, retain) NSCondition *input_ready;         // synchronization around input setup

Is it cool if I move these out of NSApplication and just make them on-demand singletons or something? In the non-standalone case, they couple things together unnecessarily (things break in my test program because my test app doesn't implement eventsResponder, and so on).
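
For reference, a minimal sketch of the kind of on-demand accessor meant here; the function name is made up, and only EventsResponder is an existing mpv class:

#import <Foundation/Foundation.h>
// assumes the header declaring mpv's EventsResponder class is imported

EventsResponder *shared_events_responder(void)
{
    static EventsResponder *responder;
    static dispatch_once_t once;
    dispatch_once(&once, ^{
        responder = [EventsResponder new];  // created lazily, on first use
    });
    return responder;
}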

pigoz commented Apr 9, 2014

No singletons please, it's a terrible pattern :) Can you post your test application somewhere so that I can use it as a test?

I think that maybe when using mpv as a library we want to disable most of the initialization of mpv's NSApp stuff. I'd still like clients to get most of the stuff from macosx_events for free, since that seems quite useful.

And I agree, at the moment the code is a little bit too coupled, but I am unsure how to modify it because I am not sure what would be the most convenient API for clients.

frau commented Apr 9, 2014

I agree with what you're saying. NSApp is already a singleton, so those three variables are de facto globals... I think they should just live in a separate file.

Is GCD okay to use? What versions of OS X does mpv support?

My test app is just DOCS/client_api_examples/simple.c pasted into a Cocoa skeleton right now. It isn't really worth seeing...

pigoz commented Apr 9, 2014

NSApp is already a singleton, so those three variables are de facto globals

Yep, it was ugly but not detrimental until we got the client API. Now that the client API is there, they should not be accessible as globals.

Is GCD okay to use?

Yes! As a matter of fact GCD is already used in video/out/cocoa_common.m.
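
As an illustration of the pattern (an assumption, not code copied from cocoa_common.m): Cocoa calls get funneled onto the main queue with GCD while mpv's own threads keep running.

#import <Cocoa/Cocoa.h>
#include <dispatch/dispatch.h>

// Assumed pattern, names made up: run AppKit work on the main thread,
// whichever thread mpv calls from.
static void run_on_main_thread(void (^block)(void))
{
    if ([NSThread isMainThread])
        block();
    else
        dispatch_sync(dispatch_get_main_queue(), block);
}

// usage: run_on_main_thread(^{ [window setTitle:@"mpv"]; });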

What versions of OS X does mpv support?

At the moment 10.7+. But that's not strict; now that 10.9 has been out for some time, we can consider bumping the requirement to 10.8+ if we need some newer APIs.

ghost commented Apr 13, 2014

Ping?

frau commented Apr 13, 2014

I've been way too busy to get very far... will attempt some more the day after tomorrow.

ghedo commented Aug 10, 2014

Can this issue be closed now? Also, @frau, maybe it would make sense to include your cocoa example under DOCS/client_api_examples as well.

ghost commented Aug 10, 2014

From what I understand, the bare client API works, but showing video with it doesn't.

frau commented Aug 11, 2014

@ghedo I suppose it can, as the original intent of this issue has been dealt with by @pigoz. I submitted the example code as #1001, which actually does render a video window now!

frau closed this as completed Aug 11, 2014

frau commented Aug 11, 2014

Or should this remain open until the API is fully functional...?

ghedo commented Aug 11, 2014

Well, the original issue was that you couldn't use the API at all; now you can, so that's solved. Any improvements on that are separate issues IMO.
