So i've been using the current Studio Artist V5 64 bit build to process movies quite a bit recently, and i thought i'd start a forum discussion on this topic, because i think a lot of our users are misinformed about how dead simple it is to process movies using the current 64 bit build (with all of its current limitations).
To paraphrase Gurdjieff, the current V5 64 bit release is no obstacle to getting real work done.
Yes, the current 64 bit build does not allow you to open a QuickTime movie file into the source area. And yes, the movie file output options are greyed out. All of this is directly due to Apple's total disregard for their digital media content developers, as well as their apparent desire to 'erase the past' rather than provide easy access to it. The QuickTime API will never be supported by Apple as a 64 bit API, and they made no effort to help their entire existing C++ developer base move to a new underlying framework, preferring to force everyone to start from scratch, in a different language to boot. And this after misleading all of those developers with another 64 bit API that they pushed for years and then abandoned as well.
But that is a whole other topic, so let's ignore that in this thread and move on to the 'how do i get useful work done' discussion.
So how do i process movies if i don't have access to movie file i/o? Great question. The answer is to work with folders of numbered frame images:
1: Convert your movie file into a folder of numbered frame images.
2: Process the movie in Studio Artist using the Action : Process with Paint Action Sequence : Image to Image menu command.
3: Convert your folder of processed frame images into a movie file.
And you are done.
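Steps 1 and 3 can be scripted outside of Studio Artist with any frame extraction tool. Here's a minimal Python sketch that builds ffmpeg command lines for both conversions (this assumes ffmpeg is installed separately; the file names, frame numbering pattern, frame rate, and codec settings are all just example values, not anything Studio Artist requires):

```python
# Sketch of steps 1 and 3 using ffmpeg via Python. ffmpeg itself is assumed
# to be installed and on the PATH; names and numbers below are example values.
import subprocess

def movie_to_frames_cmd(movie_path, frames_dir):
    # Step 1: explode a movie file into a folder of numbered frame images.
    return ["ffmpeg", "-i", movie_path, f"{frames_dir}/frame_%05d.png"]

def frames_to_movie_cmd(frames_dir, movie_path, fps=30):
    # Step 3: reassemble the processed frames into a movie file.
    return ["ffmpeg", "-framerate", str(fps),
            "-i", f"{frames_dir}/frame_%05d.png",
            "-c:v", "libx264", "-pix_fmt", "yuv420p", movie_path]

# Usage (step 2 happens in Studio Artist between these two calls):
# subprocess.run(movie_to_frames_cmd("input.mov", "frames"), check=True)
# subprocess.run(frames_to_movie_cmd("processed", "output.mp4"), check=True)
```

The numbered `%05d` pattern is what keeps the frames sorting in the correct order as a folder of images.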
Here's a screen shot of the particular Action menu i am referring to in the current 64 bit build.
I've been running extensive tests of all of our V5 embedded bezier keyframe animation functionality so we can understand any existing issues and address them in our V6 development efforts (lack of adequate support for dynamic brush paint effects when targeted as embedded bezier action steps being one very important thing to clean up).
I have not found this 3 step way of working to be any real impediment to getting things done, meaning processing movies in the Studio Artist V5 64 bit build.
The reality is that we have been encouraging customers who use Studio Artist in production environments to use this approach for years. Rendering directly to movie files worked great if you never ran into any problems. But if you were processing large numbers of frames and you did run into some issue, you could end up with a damaged movie file, and all of your processing up to that point in that file would be lost. Not the case if you are rendering to a folder of numbered frame images.
So sure, it's fine to complain about the lack of movie file i/o in the current V5 64 bit build. Believe me, i'm probably more upset about it than you are. But don't tell me you can't get any work done. Because it simply isn't true.
I should qualify any answers i give about Apple these days by pointing out that i suffer from Apple developer PTSD. So sorry about the rant if i got too intense, but i do think everything i said is valid.
I know so many 'die hard creative mac users' who work in music, video, and 2D digital art, people who were loyal Apple customers for a really long time, who find themselves in the exact same situation you find yourself in. We've been going through the same thing here. Should we move to Windows computers for our personal day to day work?
And as Apple continues to disregard fundamental user interface principles like consistency in how you do things across different apps, or throws away all of the encoded spatial memory associated with how people set up folders in a graphical file system, you wait in dread for their next stab at screwing up something as basic as the Finder.
Right now is a strange time to be buying a new computer, because things are going to change dramatically over the next year or two. Apple has stated they are transitioning Macs from intel hardware to custom ARM hardware that Apple manufactures itself. And these new ARM Macs will also have what have been iOS-only extras, like the neural net accelerator chip used for face recognition to unlock iPhones and iPads.
At the same time Nvidia is buying ARM. At the very least, Nvidia GPU cores are going to work very well coupled to ARM CPU cores. And people will be moving towards putting lots of CPU and GPU cores on a single chip substrate. Which is great because you want them to be able to share the same memory.
Microsoft has dabbled in RISC Surfaces before. But the pressure of Apple moving to RISC for their 'computer' platform, as well as Nvidia developing and promoting standardized CPU/GPU hardware they own and control, will really jump-start the 3rd party RISC Windows clone computer market.
So it would appear that the personal computer market is very quickly going to move from its current intel CPUs and under-powered GPUs to a whole new architecture based on ARM RISC chips tightly coupled with GPU engines specifically optimized to run things like neural nets.
We are seriously thinking about adding an Ubuntu platform option for Studio Artist, in addition to our Mac and Windows options. This whole platform shift coming to the personal computer market might end up making Ubuntu seem more attractive than it currently does. Or not. We will see how it all shakes out.
But i do think the time is ripe for other solutions besides windows or mac as your operating system. Creative professionals are so fed up with apple at this point that they would jump ship in a second if a really great alternative appeared.
I understand and sympathize with the frustrations regarding Apple's APIs.
I was wondering if it would be possible to treat an image sequence as a continuous image source, for more iterative/frame-by-frame workflows than processing with an action sequence.
An image sequence version of 'movie layer' would be wonderful as well.
Finally, one of the things I miss the most in the 64-bit Mac version is the ability to get live input. Any way to work around that, with say Syphon or NDI or some other non-Apple way of exchanging frames, would be amazing.
Our new cross platform movie and video framework can work with either movie files or folders of images.
I was frankly ready to write off movie layers when Apple initially killed off QuickTime, because AVFoundation is really not set up to support how we would want to use it in terms of how movie layers work in Studio Artist.
But now that we have the new framework in our development code, movie layers may very well come back working just like they do now. Except it will actually be better, because you won't have to worry about which codec you are using for all of the edits. So it's way cleaner.
Made me very happy. Because i hate to see developed features like that thrown into the dustbin of history, which is what apple would have us do.
Our magical new cross platform movie and video framework also works with live video input sources. So all of those live video capture features are coming back.
Apple did some rather screwed up things in Catalina (in my opinion as a developer of mac software).
For example. They have all of this new security stuff you need to deal with in your application. Ok, sure, i have no issue with beefing up security.
But rather than just flashing a message asking the user if they want the application to use the camera, and then returning an error with no image when you call the frame grab routine if the user had clicked no in that dialog, they instead just hard crash your application when your code tries to access that frame grab routine.
So to avoid the hard crash, you now have to specifically add additional metadata to the application's plist describing which security protocols you want to access (things like the camera or the microphone). You are also required to make a special routine call in your software, one that requires an entire unnecessary additional framework library to be added to your project (bloating your code in the process) just to set a flag associated with this. And of course you must make this set-a-security-flag call in Objective-C, even though you are writing a C++ program, because they can't be bothered to offer developers C++ headers to do anything on the platform anymore.
After you do all of this bullshit, the os will finally bring up the dialog asking the user if they want the application to use the camera when you try to access it in your software.
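For anyone hitting the same wall: the plist metadata in question is the usage-description entries in the app's Info.plist, paired with the authorization call ([AVCaptureDevice requestAccessForMediaType:completionHandler:] from AVFoundation). The key names below are Apple's actual keys; the description strings are just example wording:

```
<!-- Info.plist additions (the <string> text is example wording) -->
<key>NSCameraUsageDescription</key>
<string>Used to capture live video frames into the canvas.</string>
<key>NSMicrophoneUsageDescription</key>
<string>Used for audio input features.</string>
```

Without those keys present, the authorization call itself is what triggers the crash on Catalina.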
So everything i called 'bullshit' above really is, because there was no need for it. But it's very typical in terms of how they view developers. Rather than doing something once at the source to save the entire developer base the work of all of them doing it themselves, they instead create new and more arcane hoops that everyone must code up to jump through. Often associated with making the platform more locked down and less accessible.
Syphon support. Yeah, maybe. It's mac only i think. So that is an issue for windows people.
It would be nice to implement a cross platform video hose. It's something we've been looking into. Any suggestions?
Thanks for this reply, I'm looking forward to the magical framework -- if you need testers please consider me.
Meanwhile, I think I figured out something that works for me: load an image sequence as a content or source layer. It does what I need. I'd never had to use it before, but it is a great solution for rotoscoping, and it indeed offers more control than a movie layer. Attaching a still, and here is a quick video for reference if anyone else needs to do the same.
RE: video in, the PC equivalent of Syphon is Spout https://spout.zeal.co/ but it's not cross-platform.
For my own live performance work, lately I've been using NDI, which is promising and cross-platform: https://www.newtek.com/ndi/applications/ . It has a standalone API, but it also has bridges to Syphon/Spout for real-time applications.
Sure, working with Context Action Steps in a PASeq is definitely a way to approach it.
I'll check out NDI.
I also ran across something recently that might work, it's an open source thing that someone put together for passing video streams between applications. So we could potentially compile it for the different platforms we support.
Great, thanks, one side benefit of NDI is that there are cheap apps that turn your mobile device into a networked NDI streaming camera - a nice solution for stop motion or live performance.