Is there any chance of getting gestures incorporated into the next version of SA?
I use a tablet as an input device on my Mac, and being able to just magnify an image using a pinch gesture, instead of using the SA magnify tool, would be awesome.
It's definitely something we'll be looking at more in the future.
I have to be honest, I find myself making accidental gestures on an iPad (or an Apple trackpad mouse or PowerBook trackpad) as much as or more than using them on purpose, so I worry a little bit about how to address that. You want the ability to perform gestures to expand the control of the software, not cripple it.
Interesting. I just purchased Studio Artist. I love it!
I use the Apple touchpad on my Mac, and I just provided this same feedback to Synthetik support yesterday.
As an iPad user, this is a common gesture as well.
Also, I have been trying out Sidecar with Studio Artist and the Apple Pencil. Sort of a poor man/rich man’s Wacom. “Poor man” because I do not want to buy a Wacom. “Rich man” because an iPad Pro and a pencil cost more than Wacom! LOL!
It works pretty well, and I intend to write up a post for the forum.
I bring this up, because I find myself wanting to use iOS touch gestures with Sidecar and Studio Artist.
Cool. Definitely write a forum post on your experiences using Studio Artist with Sidecar and an iPad. It's on my long list of things to get around to doing, but I'm so busy I have not had time to mess with it. Since I'm jumping into full-on V5.5.3 development starting tomorrow, it's probably not going to happen for a while. So hearing about your experiences will be great feedback for adding in the features we need to support it better. And it will be a great educational resource for other Studio Artist users.
So a different take on this whole topic is that we have access to a whole library for creating and then using custom gestures (in addition to just being able to respond to standard user gestures). So I'll throw that topic out for people to think about and give their feedback about how they would use it if they could.
I have my own weird ideas about how we could take advantage of something like this. But I'm really curious to hear what it would mean to other people, since they might be envisioning coming at it from a whole different angle than I am.
As far as standard gesture stuff goes, we'll add pinch to our to-do list. That's a good starting point. We're at the point where we can start looking at things like this as we move forward, and we're certainly experimenting with them internally here.
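For anyone curious what basic pinch support involves, here's a minimal sketch (in Python, with invented placeholder names; this is not Studio Artist code) of mapping incremental pinch scale updates from the OS to a clamped canvas zoom:

```python
# Hypothetical sketch: mapping a trackpad/touch pinch gesture to canvas zoom.
# All names here are placeholders, not Studio Artist's actual API.

class CanvasView:
    MIN_ZOOM, MAX_ZOOM = 0.1, 32.0   # clamp so the canvas never vanishes

    def __init__(self):
        self.zoom = 1.0

    def on_pinch(self, scale_delta):
        """scale_delta is the incremental magnification for one gesture
        update from the OS (e.g. 1.05 means zoom in by 5%)."""
        self.zoom = min(self.MAX_ZOOM,
                        max(self.MIN_ZOOM, self.zoom * scale_delta))

view = CanvasView()
view.on_pinch(2.0)   # pinch out: zoom in 2x
view.on_pinch(0.5)   # pinch in: back to 1x
```

The only real design decision here is whether the OS reports absolute or incremental scale per update; this sketch assumes incremental.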
I will think about other gestures. Since I am very new to the application, I want to get to know it better. A few random thoughts after maybe only 4-6 hours of getting to know the app.
It would be cool to jog through your history with a drag or “stir” gesture. A drag would only accommodate a fixed number of steps unless you implemented a “tiered” slide: the x-axis would be your timeline, and your y-axis position could change the resolution from rough to fine as you drag.
A stir gesture would be like a rotary knob without stops that spins forever. (Adobe has a stir gesture in their Spark Post app for endlessly exploring suggested generated text styles.)
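To make the two ideas concrete, here's a rough Python sketch (invented names, nothing here is Studio Artist API): the tiered drag interpolates scrubbing resolution from the vertical position, and the stir knob accumulates unbounded rotation by unwrapping the angle around a center point:

```python
import math

def history_steps(dx, dy, height=300.0, rough=0.5, fine=0.02):
    """Tiered drag: horizontal distance dx scrubs the history timeline.
    Steps-per-pixel interpolates from rough (top of the drag area)
    to fine (bottom), so sliding down gives finer control."""
    t = min(1.0, max(0.0, dy / height))          # 0 = top, 1 = bottom
    steps_per_px = rough * (1 - t) + fine * t
    return int(dx * steps_per_px)

class StirKnob:
    """Endless rotary: accumulate the angle swept around a fixed center,
    so the value spins forever with no end stops."""
    def __init__(self, cx, cy):
        self.cx, self.cy = cx, cy
        self.prev = None
        self.turns = 0.0          # total rotation in radians, unbounded

    def touch(self, x, y):
        a = math.atan2(y - self.cy, x - self.cx)
        if self.prev is not None:
            d = a - self.prev
            if d > math.pi:       # unwrap across the -pi/+pi seam
                d -= 2 * math.pi
            elif d < -math.pi:
                d += 2 * math.pi
            self.turns += d
        self.prev = a
        return self.turns

steps_rough = history_steps(100, 0)    # near the top: coarse scrubbing
steps_fine = history_steps(100, 300)   # near the bottom: fine scrubbing
```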
Touch gestures can be a rabbit hole, of course, because they are inherently obscured from the user without visual UX elements to indicate they exist. If you don’t know pinching zooms in and out, you have to be taught. Adobe resolved this by providing a circle to trace to indicate a stir gesture.
BTW, my guess is that there are certain parameters that would benefit from dragging…for example, adjusting brush size, or cycling through colors in gradients or palettes.
But, let me do the application the service of getting to know it better before I suggest too much gesture-wise.
BTW, I just this week discovered SA, and you’re pretty much one of my heroes already. I come more from an electronic music background, and my day job has been tech here in the Bay Area. Though recently my background has been in search/information retrieval, I have a lot of background in computer graphics and computers as creative tools…friends from the old Macromedia days, and currently Apple and Adobe are both clients. And I friggin’ love Hawaii! You are doing an awesome thing having found your way building your life around this creative vision. Very cool.
Eager to see how SA continues to develop, evolve and change, especially with advancements in computing power, machine learning, user interfaces, and applications. Who could have imagined a machine learning framework like Core ML on a handheld device? And, “use cases” are changing as markets evolve and new generations of artists emerge.
Exciting to see such a personal vision so well implemented, and made so accessible and relevant to a wider user base. I am going to have a good weekend benefiting from the fruit of your years of labor, thank you!
We're a very hot key oriented software program. So you hold down the b hot key, then mouse down (or stylus down) and move the cursor if you want to adjust the brush size. You hold down the h hot key if you want to use the hand cursor to scroll the canvas. And on and on.
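As a rough illustration of that hot key + drag interaction pattern (hypothetical names only, not the actual Studio Artist implementation): the held key selects a temporary mode, and the drag delta drives whichever parameter that mode owns.

```python
# Hypothetical sketch of hot key "modal drag" handling; all names invented.

class HotKeyDragger:
    def __init__(self):
        self.held = set()
        self.brush_size = 10.0
        self.pan = [0.0, 0.0]

    def key_down(self, k):
        self.held.add(k)

    def key_up(self, k):
        self.held.discard(k)

    def drag(self, dx, dy):
        if 'b' in self.held:          # b + drag: resize the brush
            self.brush_size = max(1.0, self.brush_size + dx * 0.25)
        elif 'h' in self.held:        # h + drag: pan the canvas
            self.pan[0] += dx
            self.pan[1] += dy

ui = HotKeyDragger()
ui.key_down('b')
ui.drag(40, 0)        # brush grows while b is held
ui.key_up('b')
ui.key_down('h')
ui.drag(5, -3)        # same drag now pans instead
```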
BTW, I like that two-finger drag on the TouchPad pans the canvas like the H hot key.
Man, you’re not kidding about hot keys! I searched the manual for the phrase “hot key,” and there are a lot!
To be honest, we're more focused on PyTorch code development for working with deep learning neural nets, since it is directly tied to all AI neural net research at this point in time, and is the code platform with the greatest momentum in this field.
Unlike CoreML, which is very strange, totally proprietary, and not tied at all to the greater AI research community. So the only way we would probably ever use CoreML is maybe through something like an ONNX model, although even then, I'm not sure why we would want to use CoreML.
We have a GPU abstraction layer for Studio Artist V6 that sits on top of platform GPU apis. So if we were running neural net code directly in Studio Artist, and we wanted to run it on a GPU, we'd most likely target that.
Plus the whole point of PyTorch is that you can convert it into C++ code through TorchScript if you want to.
So there's my 2 cents on that topic at this point in time. I am not impressed at all with CoreML. I am very impressed with PyTorch.
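For anyone who hasn't tried it, the TorchScript path mentioned above looks roughly like this minimal example (the module and file name are just placeholders): you script a module in Python, save it, and the resulting file can be loaded from C++ via `torch::jit::load`.

```python
import torch

class Gain(torch.nn.Module):
    """Trivial placeholder module: multiply the input by a fixed gain."""
    def __init__(self, g: float):
        super().__init__()
        self.g = g

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.g

# Compile the Python module to TorchScript...
scripted = torch.jit.script(Gain(2.0))
# ...and serialize it; this .pt file is what a C++ host application
# would load with torch::jit::load, no Python runtime required.
scripted.save("gain.pt")
out = scripted(torch.ones(3))
```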
Oh, totally. I wasn’t proposing Core ML for Studio Artist. It is pretty much just regression and classification, and is one of those Apple-y things written not just for their own use cases, but also to conform to their philosophy of closed computing. It seems like training/modeling can be problematic, and I was really just making a Moore’s-law-ish observation.
The GPU stuff you mention pretty much wouldn’t fly with Core ML if I understand it properly. They’re not even in the same universe.
Once you start playing with Studio Artist V5.5 a little bit, you'll start to understand where we are heading. Studio Artist V5 is fun, but Studio Artist V5.5 is a whole other thing.
So since you have an electronic music background: we're very influenced by some of the new features in the new Korg Wavestation design (the gutter concept they built into it for routing), so we're working on something similar to drop into Studio Artist to help manage modulation routing better as we move forward.
Right now we have this 'injection modulation' concept going with the DualMode Paint and the Vectorizer Dual Op. We'll cover the other Dual Op options in there with the same kind of thing as time goes on.
The issue is that as the number of modulatable parameters in the program increases, manually editing them becomes a burden. So injection modulation lets you override the current parameter settings when you want to modulate an effect from somewhere else (like the DualMode Paint modulating the Vectorizer using injection override).
We're extending that with this new modulation gutter approach. So you can route signals into a modulation gutter, and then use the gutter as the modulator in the preset effect. That way you can route anything in there.
We already kind of have that for pen pressure, tilt, and tilt orientation. But the idea is to make it a general thing, rather than a specific pen override hidden inside the Miscellaneous control panel in the paint synthesizer.
So there's my synth geek out for the day. Hope it made sense.
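To make the gutter idea a bit more concrete, here's a toy Python sketch (all names invented, not Studio Artist's actual design): sources write into named lanes, parameters are routed to lanes with a depth, and an injection-style override combines the lane value with the manually edited base setting.

```python
# Hypothetical sketch of a "modulation gutter"; names are invented
# for illustration, not Studio Artist's actual API.

class ModGutter:
    """Named lanes that modulation sources write into
    and modulated parameters read back out of."""
    def __init__(self):
        self.lanes = {}    # lane name -> latest source value
        self.routes = {}   # parameter name -> (lane name, depth)

    def write(self, lane, value):
        self.lanes[lane] = value

    def route(self, param, lane, depth=1.0):
        self.routes[param] = (lane, depth)

    def modulate(self, param, base):
        """Injection-style override: if the parameter has a route,
        the gutter signal offsets the manually edited base setting;
        otherwise the base setting passes through untouched."""
        if param not in self.routes:
            return base
        lane, depth = self.routes[param]
        return base + depth * self.lanes.get(lane, 0.0)

g = ModGutter()
g.write('pen_pressure', 0.8)               # a source fills a lane
g.route('brush_size', 'pen_pressure', 10)  # a parameter taps the lane
size = g.modulate('brush_size', 4.0)       # base 4.0 + 10 * 0.8
```

The point of decoupling the lanes from both ends is that any source can drive any parameter without touching the preset's stored values.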
Wow, that is awesome. Yeah, by decoupling the lanes there are a lot more possibilities.
It’s sort of like FM/phase modulation synthesis? Can you rearrange the modulation algorithms?
I made many tests with the iPad Pro and Apple Pencil years ago, but now I'm using a Wacom Cintiq Pro 32", and I must admit it is the best option for painting gestures. Of course, it comes at a price....
I believe you.
I’m curious about the Huion.