Is there a roadmap for providing a browser source for mimoLive? I know there is a workaround using screen capture, but it’s not ideal. For example, overlays served as HTTP assets, like those from Streamlabs or Muxy, are meant to be used with a browser source. I’ve been trying the screen capture workaround, and I have to set it up again each time I restart my computer (since the browser window gets closed and reopened, ML needs to be told which window to capture again).
There are many items in esports/gaming that I would use a browser source to integrate.
I know these things are meant to be achievable using Quartz Composer, but considering the amount of HTML assets already available (and the wide range of devs who can write HTML, as opposed to Quartz Composer), it would be ideal if a browser source were available in ML.
Here’s another example of the use of a browser source - https://strexm.tv/
Gaming streamers currently can’t leverage this functionality with mimoLive (I don’t think). It might be possible with the right chroma-keying setup using screen capture, but this is not ideal.
It would be great if Boinx could provide a response to this question.
I would also love to see this feature added, as it would make it easier to use many of the streaming tools that are available, including the various chat browser sources for integrating chat into your stream and the overlay and alert functionality offered by numerous streaming and third-party services.
I recently discovered CoGeWebKit, a Quartz Composer patch that implements a WebKit browser instance within Quartz. I attempted to use it to build a custom mimoLive Browser Source layer. The problem I ran into is that CoGeWebKit has not been maintained in nearly 7 years, and some of the frameworks it relies on have apparently changed in the meantime, so it no longer functions properly. Partial renders of web content into a Billboard patch were the most I could get.
I later stumbled upon the source code for CoGeWebKit and started developing my own custom QC patch using CoGe as a guide. Then I found several articles pointing out that the WebView class CoGe uses has been effectively deprecated since OS X 10.10 (when WKWebView was introduced) and that WKWebView should be used instead.
I then started researching WKWebView and discovered that macOS 10.13 adds a new method for taking a snapshot of a WKWebView’s viewport, which seems like a good fit for outputting the image data from the custom QC patch. I just updated to the macOS 10.13 Gold Master and am continuing my attempt at developing a QC plugin that allows browser rendering in QC, which I can then tie into a custom layer for mimoLive.
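For reference, the 10.13 API in question is -takeSnapshotWithConfiguration:completionHandler: on WKWebView. A rough Objective-C sketch of how a plugin might grab a frame (the method and variable names here are mine, purely illustrative, and the web view must live on the main thread):

```objc
#import <WebKit/WebKit.h>

// Illustrative sketch only: pulls one frame out of a WKWebView.
// In a real QC plugin the resulting NSImage would be converted and
// pushed to the patch's output image port.
- (void)captureFrameFromWebView:(WKWebView *)webView {
    WKSnapshotConfiguration *config = [[WKSnapshotConfiguration alloc] init];
    config.rect = webView.bounds; // snapshot the full viewport

    // Available starting in macOS 10.13; the handler is called
    // asynchronously on the main queue.
    [webView takeSnapshotWithConfiguration:config
                         completionHandler:^(NSImage *image, NSError *error) {
        if (image != nil) {
            // Hand the image off to the QC output here.
        }
    }];
}
```

The asynchronous completion handler is worth noting: since mimoLive’s video pipeline runs on a background thread, the plugin would have to buffer the latest snapshot and serve it from the render callback rather than capture synchronously.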
I have very limited experience with Objective-C, and this is really my first project in the language, but I will not stop development on this until I make it work or until Boinx decides to build a browser source into their product.
Now for my reasoning. I am a Twitch streamer, and the vast majority of on-stream alert systems are accessed via a browser. This means that without a browser source I am extremely limited in this area.

At one point I was running 3 separate window captures with chroma keying to work around this. Each one was pointed at a separate page with some custom CSS injected so that I could chroma key them in mimoLive and let them sit over my stream. This resulted in a very poor quality stream due to the CPU usage needed for multiple window captures within mimoLive. I later worked around some of this by using a local HTML document with multiple iframes to consolidate all of these alert/overlay pages into a single window for capturing and chroma keying. That helped quite a bit.

However, I recently moved my stream from a mid-2011 iMac with an i7 to a 2016 15-inch MacBook Pro, and believe it or not, the MacBook Pro seems to be a tiny bit slower than the much older iMac in terms of CPU (at least with regard to mimoLive usage). My stream quality has decreased significantly as a result, because I am maxing out my CPU when streaming: mimoLive manages to use around 250% CPU in my current setup, and my system typically has less than 1% idle CPU while streaming.
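For anyone wanting to try the same consolidation trick, the capture page was basically this shape (a minimal sketch with placeholder URLs and sizes, not my real widget links):

```html
<!-- Hypothetical consolidation page: several widget iframes over a solid
     green background so mimoLive's Chroma Key Effect can knock it out. -->
<!DOCTYPE html>
<html>
<head>
<style>
  html, body { margin: 0; background: #00ff00; } /* chroma-key green */
  iframe { position: absolute; border: 0; }
  #alerts { top: 0;     left: 0; width: 800px; height: 600px; }
  #events { top: 620px; left: 0; width: 400px; height: 300px; }
</style>
</head>
<body>
  <!-- Placeholder widget URLs -->
  <iframe id="alerts" src="https://example.com/alertbox"></iframe>
  <iframe id="events" src="https://example.com/eventlist"></iframe>
</body>
</html>
```

One page means one window capture and one chroma key pass instead of one per widget, which is where the CPU savings came from.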
Now that I have upgraded to macOS 10.13, my idle CPU is even lower. I suspect some of this is due to data collection and debugging processes in the beta build, but even so, I cannot stream with my full setup from a maxed-out, less-than-a-year-old laptop with my streaming software using this much CPU. On a 720p 30fps stream, the result is that my audio gets out of sync with the camera, and mimoLive usually drops its output to around 20 fps; the quality degrades significantly because it cannot keep up with the inputs while encoding and doing even a single window capture with chroma key.
I will try to keep this thread updated on my development progress of both the QC Patch and the subsequent mimoLive layer.
@Stimpy Thanks for your efforts! We actually tried to develop a web browser plugin for Quartz Composer some years ago, but unfortunately it turned out not to be stable, so we never released it. As far as I can remember, the problem was that the web view needs to be rendered on the main thread of the application, while the video pipeline in mimoLive (and with it all the Quartz Composer processing) runs on a background thread. That’s a bit tricky.
It may actually be better to have a native web browser source rather than requiring a Quartz Composer plugin. Sure enough, we have it on our wish list!
@Stimpy The workaround I have in place right now is that I don’t use ML to stream. I use it to generate my graphics, scenes, etc., but I output to an i7 NUC device and stream from that using XSplit. XSplit has a native browser source, so my XSplit scene has the ML output as the bottom layer and the stream tools browser page as the top layer. The NUC manages 720p at 60fps, but no more before maxing out its CPU. If you have a spare PC with a discrete graphics card, you’d easily manage 1080p at 60fps.
Still, the best result would be the browser source for MimoLive, it’s what I’d prefer!
@chrisis How are you getting the audio and video from mimoLive into the NUC? A capture device of some kind? Network stream like NDI or RTMP?
Maybe try an iPad: load the site on it and then set it up as a source like you would anything else? https://docs.mimo.live/v2.0/docs/ios-device-video
@kmac1036 The problem with that is the way that most of the web based stream tools work, they require something that can handle a transparent background. They operate more as overlays than actual web content.
I have taken @chrisis’s idea and worked out a way to make it work. I have done a little testing, but no actual streams yet with the new setup.

My new setup:
mimoLive on a MacBook Pro is outputting to a custom RTMP server running on a Raspberry Pi on my local network.
OBS running on an iMac is ingesting the RTMP stream from the Raspberry Pi as a media source. Within OBS I have the Browser Source items configured to overlay on top of the RTMP input from the Pi/mimoLive, and OBS then broadcasts out to Twitch.
What testing I have done shows significant promise. The extra RTMP hop does add about 2 seconds of latency between what I do on camera and when viewers see the output, but since the overlays OBS is adding don’t need to sync up with the other content, it doesn’t really matter all that much.
Also using RTMP to move the main stream between the systems means I didn’t have to spend extra money on another capture device.
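For anyone curious about the Pi relay: an RTMP server like this can be run with the nginx-rtmp module. This is a minimal hedged sketch, not my exact config, and the application name is a placeholder:

```nginx
# Minimal nginx-rtmp relay (illustrative). mimoLive pushes to
# rtmp://<pi-address>/live/stream, and OBS ingests the same URL
# as a media source.
rtmp {
    server {
        listen 1935;        # default RTMP port
        chunk_size 4096;

        application live {
            live on;        # accept live publishing
            record off;     # don't write recordings to disk
        }
    }
}
```

The Pi only relays packets, so the CPU cost lands on the encoding machines rather than the relay.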
@Stimpy I wonder if there is a way to use NDI instead of RTMP to move the video around. For example, I’m not sure whether it would work to use the WebNDI app on an iPad to send the web content to mimoLive. The problem might be transparency; I don’t know how a browser renders a transparent web page. But if the web page is keyable, that would be a possibility.
OK, I didn’t realize the transparency effect was also desired. Another dumb idea: try using the chroma key feature? Then you could keep it all “in house” with mimoLive… just a thought I had.
Actually my previous setup was a local html doc containing multiple iframes for different pages and CSS override for a green background. This was displayed on its own little 1080p monitor. Then I would use a Display Capture source in mimoLive with Chroma Key Effect and some masking.
The issue with this was that the CPU usage from the Chroma Key Effect was far too high for my stream setup. I would see nearly a 2x increase in mimoLive’s CPU usage when this source was active, and approximately 75% of that increase was due to the Chroma Key Effect. I typically have at least 3 video sources, sometimes up to 5 or 6, active at any time, so I am already pushing my system to the limit and needed a way to either offload this work or make it more efficient.
So I switched to outputting from mimoLive via an RTMP server running on my local network, then ingesting that into OBS on another system, adding the Browser Source layers in OBS, then outputting to Twitch from OBS. This has managed to significantly decrease the CPU usage of mimoLive. I still have some tuning to do to get the stream stable and smooth but it is working better so far than my old solution with Chroma Keying. The only major drawback currently is a slight increase in stream delay due to the extra RTMP hop. Now my stream delay runs about 10 seconds instead of 8.
As for Browser Window Transparency there is no reason this isn’t possible, but most standalone browsers do not implement it (possibly due to security concerns since having an invisible browser window running on your system could be scary if you don’t know what it is doing). OBS appears to use CEF which is an embedded Chromium implementation. I don’t see why this or some sort of helper application utilizing WebKit couldn’t be worked into a new Source or Layer type for mimoLive.
As for NDI, that would help eliminate some of the delay, but I would need a more expensive license for mimoLive and some sort of receiving software on the OBS end. Most of the NDI software I have seen is somewhat expensive. I would then likely have to do a window capture of that into OBS along with some sort of audio capture (both of which would be less CPU-efficient and would likely have synchronization issues I’d have to work out). The NDI browser on a tablet would be a good idea if I were just planning on sharing an entire webpage to a stream, but that is not what Twitch streamers use Browser Sources for. They are used for overlay elements with transparency that add additional layers of information and interaction with viewers.
@Stimpy Thank you for the detailed information about your use case. Can you give us a sample HTML file that you use to help us find out if our web source could help you there?
Also, regarding NDI: we are currently changing the requirements so that NDI will work with all licenses of mimoLive.
@Stimpy XSplit can ingest over NDI. So can vMix.
But really, what we need is a browser source implementation from Boinx. All competitor products can do it, and we are all doing workarounds or spending money on extra equipment because mimoLive doesn’t have one.
@imchrisis Yes I realize that those can handle NDI input but I really don’t want to have to pay for an additional streaming software.
I did find an NDI plugin for OBS that allows NDI ingest, but in brief testing the quality was very poor: a low and inconsistent frame rate, even over a 1 Gbps wired connection. I suspect the problem was on the OBS receiving side but will need to test more with 3.2b2. I had several problems with 3.2b1 that seemed to originate from the fact that I had an external GPU connected. I could not get any documents to load in 3.2b1 with the eGPU connected; mimoLive would crash when opening a document (even a new blank one).
Agreed we definitely need a Browser Source implementation in mimoLive.
@“Oliver Breidenbach” I would rather not post the HTML samples to the forum as they may have sensitive API key data included. Is there a more secure way that I can get those to you?
Otherwise, the majority of the Browser Source items I am using are from StreamLabs.com: widgets like their Event List, AlertBox, The Jar, Goals, etc. Some other services, such as Muxy, provide similar functionality. I would also like to include possible Browser Sources from the bot software I use (PhantomBot), which would allow viewer-chat-triggered custom animations (GIFs with MP3 audio). Twitch also has several new third-party overlay extensions that make use of Browser Sources (Smart Click Maps, https://twitch.exmachina.nl/, being the one I am most interested in).