A discussion about building realtime video products at the Mux TMI event.
Joe Burfitt, Modyfi
Jenny: Thanks, Phil. Now we get to see what’s already been built with this API. We’ve got two demos today that will show different use cases and the constraints that come with them. We’ll see where there’s been friction and where things just worked. First up, we have Joe Burfitt from Modyfi. Joe is a longtime friend and advisor of Mux. Previously, he’s built video products at Amazon, Snap, and Quibi, and for the past six months, he’s been building something new, and until today, secret, as CEO and founder of his new company, Modyfi. We’re excited to hear all about it. Let’s welcome Joe.
Joe: Thanks, Jenny. I’m Joe from Modyfi. We’re an early-stage startup focused on building a new platform for content creators, enabling them to work together to create digital masterpieces. Creativity is what Modyfi cares about the most. But in today’s modern web apps, collaboration is key. We often lose context with comments once we get into the details. So, it’s important to be able to talk, or at least have a video call, to convey that information. Traditional conferencing products take minutes: you create the link, everybody joins, you share your screen, make sure everybody can actually see it, and then you get a kind of crappy 15-frames-a-second stream. Right, that’s not going to be useful for portraying anything creative. So, it’s more important than ever to have the tools you need in the place where you’re actually working, so you can work asynchronously and synchronously together with no barriers to entry. That’s why we partnered with Mux to enable real-time video calling within Modyfi. Modyfi can concentrate on building creative products while taking advantage of an incredibly powerful platform, which is Mux. That built-in video calling lets people quickly talk through what they’re working on and what they need help with. So, let’s get into a demo. Some rebranding: I know that John loves hot sauce, coffee, and probably likes his face. So, this is what we can do. Quickly, we can select the box logo at the top, and we’re going to mask it off. Those coffee beans are a bit dark in the middle. Let’s get some feedback from my colleagues. Okay, I added them in and joined the call. Now they’re going to look around and add some comments. Okay, there’s a bit of confusion here, like, okay, what do we actually want to talk about? Let’s just join a call. So, they all join in. Again, this is powered by Mux. It is very quick and easy to set up; it took us a week to get going with this. I think they are coalescing around the hot sauce.
Let’s try this in a few different colors, though. So, maybe green? No, that’s not the Mux brand. So, we’ll play around with that. I think I like a pink. I think that’s more on brand. There we go. So, we’ve got Mux’s new one. Sorry John, it’s not your face; it’s going to be the hot sauce. We couldn’t be happier with Spaces. It’s taken an incredibly short amount of time to get this working. It saves us months of work compared to doing it ourselves, pushing that burden onto Mux to solve these problems and letting us concentrate on making a spectacular creative platform.
Jenny: This is really cool, Joe. This is awesome. So, the first question I’m wondering, and maybe everyone else, is when are we all going to be able to try this?
Joe: We’re still building. But we’re aiming to get a private beta out this year. So, if you’re interested, feel free to email me at [email protected] and we’ll put you on the list. So, when we go live, you can try it out and give us feedback.
Jenny: Awesome. Great. So, besides building in the real-time part, what’s it been like to build the Modyfi product? What’s been the most challenging part of that process for you?
Joe: All of it. Yeah, if we look at other companies in this space, they are decades old. There’s a lot of technology that has already been built, and we’re trying to catch up while also thinking about it in a modern-day environment. So, how do we approach things completely from scratch? That’s why it’s taken us six months, and getting to where we are right now in that time is pretty fast. We’ve probably got a few more months before we get into that beta phase. But yeah, everything is a challenge.
Jenny: Yeah, it sounds like you’ve had a lot of those. So, knowing that you’ve got a lot to do, a lot to build, and a lot of challenges, what led to the decision to integrate real-time now?
Joe: Honestly, we wouldn’t have done it at this point in time. I think we would have waited much later; it was probably like a P5 on our list of things to build. It was only our tight connection with Mux, and how easy Mux made it to build, that led us to bring it forward and put it into the application now to make that collaboration much easier.
Jenny: That’s great. I’m glad that you’ve been able to get that in there. So, for folks who are just starting out on their journeys of building collaboration into their tools or something else, what’s something that when you started out, you wish you would have known that you could maybe share with them?
Joe: Concentrate on what your key product is (for Modyfi, it’s creativity) and then outsource as much of that technical uplift as you can to other vendors like Mux. Ease the burden on yourself and make sure you concentrate on the things which are going to be the differentiator for your platform.
Jenny: Awesome, great advice. Thank you so much for being here, Joe.
Joe: Thank you very much, Jenny.
Jenny: Next up, one of our partners is going to demo the very first application built with the real-time API. This is the one that Phil mentioned earlier, Mux Meet. Because we’re building a video product for developers, of course, our team wanted to use the tool ourselves. So, the Mux team built a basic meeting app that we could use for our daily stand-ups. Here you can see some of those early and somewhat embarrassing screenshots from that. But it helped us find and fix bugs quickly, and it helped us improve the SDK along the way. What we needed, though, was to really accelerate development so that this tool would be ready for our whole company to use and for all of you to use as a reference project, and that’s where our partner QuarkWorks comes in. Brett Koonce is the CTO and co-founder of QuarkWorks. He and his team have experience building several products using WebRTC, including Houseparty and Reddit Live. So, we thought it would be a perfect fit to take our team side project to the next level. He’s here to tell us a little bit more about that. Welcome, Brett.
Brett: Hello, and thank you all for having me. What we’re going to show here first is just a demo of how to set up the whole Mux Meet experience. At a high level, we’re simply going to git-clone the repo to pull the code down from the internet. Next, we need to connect to the Mux API. For this, we’ll need to set up some keys and signing access. So, we’re going to copy the example environment file over, and then from here, we’re going to edit it for our specific implementation. We’ll jump over to the Mux dashboard, to the Spaces API, and we’ll set up a set of API access tokens. This is needed for all projects on Mux in general. We’ll generate the token, copy it, paste it into our project, and then repeat this process for our secret key. The real-time streaming API also needs its own little set of signing keys. So, we’re also going to generate a public-private key pair and add it to our project. For this, we’ll use the dashboard again to generate a key, and then once again we’ll simply copy things over to our project’s environment variables. Here’s the long, ugly one, and voilà. Now we’ve done all the configuration necessary to take our demo project from the internet and use it ourselves. The final little step is to manually create an actual Space to use for the project, and we’ve done that now on the back end. From here, the process should be relatively straightforward. We do our standard npm install process, pulling some plugins and whatnot down from the internet, and then finally we can launch Node in development mode and see everything locally. So, we’ll load things up here and, voilà, we have Phil appear on the screen. That’s literally all that’s needed, and then just to sort of showcase our tech, we’ll add in another person. So, here’s Jared. I believe they’re talking about code here.
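The walkthrough above boils down to just a few commands. Here is a minimal sketch of that setup; the repository URL and the environment variable names are assumptions for illustration, so check the project’s own example env file for the exact names it expects.

```shell
# Sketch of the Mux Meet local setup described in the demo above.
# Repo URL and variable names are assumptions, not confirmed values.

git clone https://github.com/muxinc/meet.git
cd meet

# Copy the example environment file and fill it in.
cp .env.example .env.local

# Values come from the Mux dashboard:
#   MUX_TOKEN_ID / MUX_TOKEN_SECRET     - API access token pair
#   MUX_SIGNING_KEY / MUX_PRIVATE_KEY   - Spaces signing key pair
#   (plus the ID of the Space created in the dashboard)

npm install     # pull dependencies down from the internet
npm run dev     # launch the app locally in development mode
```

Once the dev server is running, opening the local URL it prints should show the meeting UI, as in the demo.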
So, we’ll actually add the computer itself as a video source, so to speak. So, we’ll share the screen, and there we go. Now we can look at our React app and pair program on it together, and then modify things to our heart’s content. Anyway, this is all pretty turnkey. In just a few minutes, we’re able to get up and running. Thanks again, Phil and Jared.
Jenny: Awesome, thanks for walking us through that demo, Brett. So, since this isn’t your first project that you’ve been building with real-time, can you tell us a little bit more about how you’ve integrated real-time in the past, and the challenges with those projects and how it compares?
Brett: Okay. My experience with real-time technology in general is that things start off simple, but they get really complicated in a hurry. One-on-one on a single platform is generally a good place to get started. But then, as you start adding more clients and peers to the project, things can really start to get interesting in a hurry. Real-time video technology in general is built on top of a large and complicated set of hardware and software working together to make all this stuff work. As engineers, I think we love the idea of having clean, beautiful APIs to work with and being able to have the perfect interface. But my experience with real-time tech is that, more often than not, we have to go in the opposite direction. We have to get our hands dirty and go down into the internals, dealing with C++ templates or low-level networking details. And then, even when our tech all works, deploying into the world brings its own set of challenges: bandwidth, latency, cloud gremlins. There’s a whole sea of monsters out there waiting to eat your packets before they get to the end customer. I’m not going to say this stuff is impossible, certainly not. But real-time systems have a very unique set of challenges that make them a particularly interesting problem to try and solve.
Jenny: Yeah, it sounds like it. So, going forward, what use cases or applications do you see your team being able to use the Mux real-time API for in the future?
Brett: To me, videos are an extremely common feature request, especially for consumer apps. People love video. Having said that, my experience is also that people do not have a very clear vision of how video is going to fit into their project from day one or how all this stuff is going to work together on the backend. So, what this means in practice is that, more often than not, you’re going to have to end up rewriting things a few times on the way. So, in the world of startups, you sort of have this concept of innovation tokens. I think this concept can be expressed quite simply as saying, “Are you spending your precious time and energy on the one thing that makes your app unique?” So, to me, the joy of this API is quite simple. We did the demo earlier, and in just a few minutes you were able to get an enormous amount of functionality up and running out of the box. So, on day one, you can have all this. Where you go from there is up to you.
Jenny: Awesome. Well, thank you so much, Brett, and thank you to everyone at QuarkWorks for making this project awesome. We really appreciate it. Thank you. Okay, you’ve seen a couple of examples of what you can do with the API and how it can get out of your way to let you focus on your core business problems, and you’ve gotten a preview of the open-source project that you can use to get started today. I hope you do, and I’m really excited to see what feedback you’re going to bring us.