I've not been happy about the lack of support from Honk since the v3 update. Things are a little wonky and the app seems to pick up game sounds as my voice from time to time. Discord doesn't do this, so I'm fairly sure it's the app.
The other day I was watching a streamer using Veadotube mini. While it doesn't lip sync like Honk, it seemed like a reasonable alternative as it did offer a few other things Honk did not.
So I started drawing up a plan with expressions and whatnot. Told my husband my idea and he's like, why not just do that Live2D thing you keep talking about?
F*** you for making sense.
So yesterday I watched some quick videos on how it all works and how to do the art. Last night I started working on the art. I'm sure there are some changes I'll need to make (the arms, strings on the hoodie, neck, possibly cheek fluff) but it's a good start.
Now to find the motivation to bring it all in and start trying to rig it. My one brain cell is screaming about learning a new program that has 'parameters'. It goes immediately into flight mode when it recognizes anything even slightly related to programming. Thankfully, if I get stuck with anything 'programming' related, I'm married to someone who plays with that shit all day long.
So yeah. Let's see if I can make this happen.
I've always wanted my character to be able to follow my expressions because there are times I stare right at my screen and just blink, blink, blink, blink at some of the things that get said in chat. XD
Anyways, I wanted to write about it and put it out into the world so others know, and now I need to follow through with it.
