Any update/timeline on when Apple Silicon will be fully (and natively) supported by mimoLive? I’m currently using a 2014 iMac with a Fusion Drive that is starting to cause me issues (not sure if it’s the Fusion Drive or the iMac itself). I’ll continue to troubleshoot, but I’m starting to turn an eye to a new computer to use with mimoLive. I’m considering one of the new M1 Mac minis, but obviously don’t want to jump until it’s fully supported. I’m probably (hopefully!) on a 6 to 12 month timeline, but if this computer dies prematurely and I need to make a run to the Apple Store, I’d like to know what to buy.
We do not yet have a timeline for native Apple Silicon support. The first step is going to be to make sure mimoLive works well under Rosetta 2 on the M1 models. This is high priority.
Native support depends on a number of third party frameworks becoming available for Apple Silicon.
Thanks Oliver! Looks like if this Mac dies, I’ll be replacing it with an intel for now.
Just curious about this, too. It looks like I’ll have a decked-out, 16GB/2TB M1 mini arriving here at the office and my temptation is to use it in the studio. But obviously mimoLive support is crucial, and the 8-core 2019 iMac isn’t exactly a slouch.
Any idea how mimoLive is performing under Rosetta 2? And/or any idea of an updated M1 timeline, @Oliver_Boinx?
No update on the timeline yet.
But I just spoke with a customer yesterday doing his live stream Tuesday on an M1 with 2 guests via mimoCall, a caller via Skype/NDI and a local camera with chroma key: https://www.youtube.com/watch?v=WNokwvdqG5U
So, it really depends on what you’re doing. It could work, but there is a good chance you will cross the tipping point where the load overwhelms the machine.
I recently purchased a Mac mini M1, the 8GB RAM version, and so far I have not run into any issues. Worth noting: I always run two mimoLive instances, currently on two different machines. The M1 solely streams via mimoLive to an RTMP server, and sometimes also records the show. I use another machine in combination with an UltraStudio HD to do all the graphics via mimoLive, outputting a key and fill signal.
Hi @Hindrik, would you mind sharing a few more details about your setup? I was thinking of doing something similar, using one Mac for mixing and adding graphics and another for streaming/recording. How do you pass AV signals between your two Macs?
You can use mimoCall to send from one mimoLive instance to another. An alternative would be BMD UltraStudio Mini Recorders.
Would you mind sharing more about your setup and the connection between your machines and mimoLive?
Tested on my M1 Mac mini with 16GB RAM. Recording performance to my Thunderbolt 3 drive is not good. Hoping for native M1 support soon. Anyone else seeing these performance issues?
Hi @Philip_Ohler, we are aware of performance issues on M1 and are working on them. The severity of the issues depends on your mimoLive document; it is not the recording per se that has performance issues. It depends on a number of factors.
Thanks Oliver. I was testing a simple NDI input from Microsoft Teams. Drive performance suffered on only a single record output.
This could be because NDI is using up processing resources. NDI, especially Full NDI, uses software encoding/decoding, which is currently not M1-native. Are you running both mimoLive and MS Teams on the same computer?
Hi @Gabriele_LS and @hutchinson.james_boi, sorry for the late reply. We’ve been really busy the last few weeks with a livestreaming production, which is great of course!
I’d love to share something about my current setup; however, it no longer includes multiple mimoLive instances. Let me give you a small overview of the different approaches I’ve used in the past to provide some context.
I’ve been using mimoLive since 2017, starting with version 2.8, together with an UltraStudio Mini Recorder. We did not produce livestreams often back then. It wasn’t until the pandemic hit that livestreams became more sought after, and the production value also increased.
My first setup included an ATEM Mini Pro that I brought back into mimoLive via the USB-C output. I did not use the hardware encoder, because I preferred the flexibility of mimoLive. One major issue that eventually made me move on was the color range problem with the ATEM Mini series. In short, the device output one color range while the webcam emulation software interpreted it as another, which resulted in terrible color dynamics, mainly affecting the white and black levels (read more via: Blackmagic Forum • View topic - ATEM Mini - USB-C Output Contrast and Delay). That’s when I decided to upgrade.
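For background on that color range problem: “video” (limited) range puts black at code 16 and white at 235, while “full” range uses 0–255. When one side outputs limited range and the other interprets it as full, blacks turn grey and whites go dim. A minimal sketch of the standard 8-bit conversion:

```python
def limited_to_full(y: int) -> int:
    """Expand limited/video-range (16-235) luma to full range (0-255)."""
    # Clamp first so super-black/super-white values don't overflow.
    y = max(16, min(235, y))
    return round((y - 16) * 255 / 219)

def full_to_limited(y: int) -> int:
    """Compress full-range (0-255) luma to limited/video range (16-235)."""
    return round(y * 219 / 255) + 16

# If a device outputs limited range but the receiver assumes full range,
# black (16) never reaches 0 and white (235) never reaches 255 --
# exactly the washed-out contrast described above.
print(limited_to_full(16), limited_to_full(235))  # 0 255
```

Note the mismatch only washes out or crushes levels; fixing it is a simple linear remap, which is why the forum workarounds are all about forcing both ends onto the same range.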
After purchasing an ATEM Television Studio HD and an UltraStudio HD Mini, I came up with the following setup. I used mimoLive to generate all our media content and display it via the UltraStudio HD Mini as an SDI playout (fill & key). It’s great to have your visuals on the multiview, especially when you use an image director. The program output of the ATEM was then routed back into the UltraStudio HD Mini and used as an input source in another mimoLive session. Blackmagic claims it is impossible to use SDI playout and capture simultaneously, but it works perfectly fine. Since I use a Mac mini (2018, Sonnet RackMac enclosure), I needed a boost in graphics performance, so I used Sonnet’s eGFX in combination with a VEGA 56 graphics card. This also gave me some extra outputs to utilize.
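For readers unfamiliar with fill & key: the fill signal carries the graphics’ colors and the key carries the alpha matte, and the switcher combines them with the camera picture per pixel. A toy sketch of that linear-key compositing step (assuming a straight, non-premultiplied fill):

```python
def key_over(fill: float, key: float, background: float) -> float:
    """Composite one pixel value: 'key' (0.0-1.0 alpha) controls how much
    of the fill graphic covers the camera background."""
    return fill * key + background * (1.0 - key)

# key = 1.0 -> pure graphics; key = 0.0 -> pure camera; 0.5 -> 50/50 blend
print(key_over(200, 0.5, 50))  # 125.0
```

Sending key and fill as two separate SDI signals is what lets the downstream switcher do this blend itself, instead of baking the graphics into one picture.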
Suddenly it seemed I was doing a lot of processing on one computer. I make use of a lot of automations and API connections, which kind of scares me. Therefore, I purchased a new Mac mini M1 (I mean, what’s the point of purchasing a more expensive Intel Mac mini?). I used mimoLive on that Mac mini solely to stream and record, using an UltraStudio Mini Recorder 3G. Honestly, the results were quite bad. All our productions are set to 1080p 50fps, mainly because I don’t like to use a lot of deinterlacers. This means the bandwidth of the data is quite high, and the M1 was not capable of both streaming and recording simultaneously. To be fair, it did the job, but with loads of dropouts, mainly during recording.
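To put “the bandwidth of the data is quite high” into numbers, here is a quick back-of-the-envelope calculation for uncompressed 1080p50 (assuming 10-bit 4:2:2, the usual SDI pixel format; the actual bitrate written to disk depends on the codec):

```python
width, height, fps = 1920, 1080, 50
bits_per_pixel = 20  # 10-bit 4:2:2: 10 bits luma + 10 bits shared chroma per pixel

raw_bps = width * height * fps * bits_per_pixel
print(f"Uncompressed 1080p50: {raw_bps / 1e9:.2f} Gbit/s")  # ~2.07 Gbit/s

# 1080p50 needs a 3G-SDI link (~3 Gbit/s), hence the "3G" in the
# UltraStudio Mini Recorder 3G; 1080p25/30 fits in HD-SDI (~1.5 Gbit/s).
```

Even after encoding, a machine that is simultaneously capturing, encoding for the stream, and encoding for the recording is doing three bandwidth-heavy jobs at once, which matches the dropouts described above.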
In the meantime, there was another problem to overcome: video playback on a television in the studio. A lot of our clients like to have a television behind them on the set, displaying a logo, slides, video playback, etc. This requires a separate mix, which I tried to achieve by using mimoLive in creative ways. I had three mimoLive instances running at the same time: one for streaming and recording (which wasn’t performing to our expectations), one for visuals, and one for the television. I used Syphon to transfer the video data and Loopback to transfer audio. Additionally, I used a load of automations and a complex Bitfocus Companion setup to make everything work, and that’s where I learned that I was leaning too much on mimoLive.
The multiview on top of the eGFX is from a borrowed ATEM Mini Pro. I purchased the Web Presenter HD as a replacement.
So, I purchased ProPresenter 7, and it’s great! One Mac mini (2018) displays all the graphic overlays for the livestream (no need for the eGPU anymore), and I use the API documentation to connect it to Bitfocus Companion. The other Mac mini (M1) runs ProPresenter 7, outputting three SDI signals via a Blackmagic DeckLink Duo 2 (enclosed in the Sonnet eGFX): one signal for playback purely for the stream, one for the multiview displaying the timers and cues, and one directly to the television.
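For anyone wiring up a similar Companion-to-ProPresenter link: it generally comes down to firing small HTTP requests at ProPresenter’s network API. A minimal sketch that only builds the request URLs (host, port, and endpoint paths here are illustrative assumptions, not ProPresenter’s documented routes — check the official API documentation for the real ones):

```python
def pp_url(path: str, host: str = "127.0.0.1", port: int = 1025) -> str:
    """Build a URL for a (hypothetical) ProPresenter HTTP API endpoint."""
    return f"http://{host}:{port}/{path.lstrip('/')}"

# Example commands a Companion button might fire (paths are assumptions):
next_slide = pp_url("/v1/trigger/next")
clear_all = pp_url("/v1/clear/layer/all")
print(next_slide)  # http://127.0.0.1:1025/v1/trigger/next
# Actually sending one would be e.g. urllib.request.urlopen(next_slide)
```

Companion’s built-in ProPresenter module wraps exactly this kind of call behind buttons, so you rarely need to write it yourself; the sketch just shows what travels over the wire.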
As you can see in the pictures, everything is held together in one 6U flight case. I recently ordered another 6U flight case and plan on separating “final processing” from “content creation”. Additionally, I plan on fully utilizing Dante for audio routing. Loopback is fun and all, but it’s not meant for professional use. Lastly, I plan on developing a mimoLive instance for Bitfocus Companion. Boinx recently released a Stream Deck plug-in, but Bitfocus Companion is simply the leading option.
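On the Companion idea: mimoLive exposes an HTTP API for remote control, so a Companion instance would essentially wrap calls like the ones sketched below. The default port, path scheme, and the document/layer IDs are assumptions and placeholders — verify them against the mimoLive API documentation before relying on them:

```python
def mimolive_url(path: str, host: str = "127.0.0.1", port: int = 8989) -> str:
    """Build a mimoLive HTTP API URL (default port assumed; check your setup)."""
    return f"http://{host}:{port}/api/v1/{path.strip('/')}"

# '12345' and 'ABCDE' below are placeholder document/layer IDs, not real ones:
list_documents = mimolive_url("documents")
layer_live_on = mimolive_url("documents/12345/layers/ABCDE/setLive")
print(list_documents)  # http://127.0.0.1:8989/api/v1/documents
```

A Companion module is then mostly bookkeeping: map button presses to these URLs and poll the API for state to drive button feedback.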
I hope this gives you a little insight into how I work and use mimoLive in practice. I definitely wouldn’t have gotten this far if @Oliver_Boinx and @Achim_Boinx weren’t so closely involved in the community.
Hello @Hindrik, thank you for sharing such a detailed description of your setup(s). The development path you followed is quite interesting. I would have a lot of questions, but I don’t want to bother you too much. Anyway, would you mind sharing a bit of information about the kind of events you stream to the web? What kind of A/V inputs do you have?
You can use NDI or Syphon to do that.
You’re very welcome. It’s always good to reflect. Ask away; I have nothing to hide, so the questions shouldn’t be troublesome to answer.
Concerning the kind of events: it differs a lot. We’ve done multiple corporate events for companies that wanted to engage with their consumers in a more interactive way during the pandemic. This can be through Zoom, or a more open CDN with proper interactive options, e.g. Vimeo. The third picture in my post is from last week, when we were streaming Liberation Day in the Netherlands. We had 11 bands performing live music in a pop venue (where I also work). We pre-recorded 6 of them and did 5 performances live. Playback via ProPresenter is amazing because you can easily output a timer showing the remaining time of the video, which is a great help for the host.
The A/V patch currently looks as follows. Please note that the first 4 inputs are HDMI, followed by 4 SDI inputs and outputs.
- Fill (MimoLive)
- Key (MimoLive)
- Playback (ProPresenter 7)
- Stage timer (ProPresenter 7)
- Camera 1
- Camera 2
- Camera 3
- Camera 4
Not really; NDI isn’t officially supported yet, at least not by mimoLive. ProPresenter recently (about a week ago) released NDI support for M1; however, it can only be used for playback. Additionally, I don’t really like the latency NDI introduces. Syphon would be way better, but I prefer hardware routing.
Hi @Hindrik, NDI on the M1 has worked with mimoLive since version 5.10, released in February 2021.
So, are you using ProPresenter as a player, in a way similar to how you would use Qlab?