Forum Replies Created
-
Hi Adam,
This is great! I’m trying to take this concept to the next level with something along these lines, where you could offset an animation of the emitter over each clone.
Do you have any ideas? I can’t seem to get any of the effectors to work the way I want. The closest I’ve come is the Formula effector, but I could only get it to animate the position of the emitter rather than the emitter’s properties.
Any ideas would be most appreciated. Thanks!
-Brian
-
This thread is obviously not going any further, but since I just discovered it….
[Jason Byfield] “Just chiming in here in the hopes that BM will come across it and realize that more than just broadcast professionals use their equipment.”
I second that. I work almost exclusively with animation, all of which is output at a straight 30fps. All of the content we create lives online and is never broadcast, but I still have the same need for quality output for internal/client reviews. Our clients generally don’t have the budget (or the need) for us to work in 1080p.
-
Okay, I wish I had found this first.
I dropped this in, and it gets me exactly what I need for constraining the emitter to the comp size! Now how might I constrain the distance that it travels between “segMin” and “segMax”?
Thanks! (and sorry again for not finding this sooner.)
segMin = .2; //minimum segment duration
segMax = .6; //maximum segment duration
minVal = [0.1*thisComp.width, 0.1*thisComp.height];
maxVal = [0.9*thisComp.width, 0.9*thisComp.height];
end = 0;
j = 0;
while ( time >= end){
j += 1;
seedRandom(j,true);
start = end;
end += random(segMin,segMax);
}
endVal = random(minVal,maxVal);
seedRandom(j-1,true);
dummy=random(); //this is a throw-away value
startVal = random(minVal,maxVal);
ease(time,start,end,startVal,endVal)
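Since AE expressions are just JavaScript, here is one way the travel distance per segment could be capped: scale the start-to-end offset down whenever it exceeds a maximum. This is a sketch in plain JavaScript, not a drop-in expression, and `maxDist` is a hypothetical parameter that is not part of the expression above.

```javascript
// Sketch only: plain JavaScript, not a drop-in AE expression.
// "maxDist" is a hypothetical maximum travel distance in pixels.
function clampTravel(startVal, endVal, maxDist) {
  const dx = endVal[0] - startVal[0];
  const dy = endVal[1] - startVal[1];
  const dist = Math.sqrt(dx * dx + dy * dy);
  if (dist <= maxDist) return endVal;   // within range: keep the target
  const s = maxDist / dist;             // shrink the offset vector
  return [startVal[0] + dx * s, startVal[1] + dy * s];
}
```

In the expression above, the same math could be applied once both values exist, i.e. after `startVal` is computed and before the final `ease()` call, replacing `endVal` with the clamped result.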
-
Correction…. I just opened it and started playing with it, and realized this just shaved probably a whole day of work off. This is EXACTLY what I was looking for. You just saved my bacon!
Thanks a million!
-
Brian-
Thanks for this! I’d never heard of this, but it’s great. I made something similar yesterday with Xpresso, but the time it took to replicate everything was not awesome. Since this is a short turnaround, this is probably the solution I’ll use.
Thanks for your help!
-BS
-
Randy, I was afraid of that. Sitting at home last night thinking about it, I started to realize that the solution was probably in Python. Either that or Processing… something I have no idea how to do.
Thanks!
-
Well I ended up getting it taken care of. I had to set it to render while I left town for a few days on vacation, but it was done by the time I got back. Simply excluding the grass and plants from the GI seemed to help the most.
I have found that rendering out at 72dpi vs. 300dpi isn’t that big of a difference. You are correct that it is usually reserved for print. However, the work I’ve done lately is all for web, and it seems faster to render out stills for different elements at 300dpi at 1920×1080 so we have the maximum scaling possible. Ultimately, before it goes into a site, it will be optimized for web through Photoshop.
As for the project I was working on at the time of this post, I was still a noob and wanted everything as hi-res as possible. (There was no practical reason for the 300 dpi on this image sequence.)
Don’t know if that helps.
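One sanity check on the DPI point, for anyone finding this later (a sketch; the numbers are illustrative): at fixed pixel dimensions, the DPI tag only changes the implied physical print size, not the rendered pixel data.

```javascript
// DPI is print metadata: a 1920x1080 render contains the same pixels
// whether it is tagged 72dpi or 300dpi; only the implied print size
// (in inches) changes. Illustrative values, not from the original post.
function printSizeInches(widthPx, heightPx, dpi) {
  return { width: widthPx / dpi, height: heightPx / dpi };
}

console.log(printSizeInches(1920, 1080, 72));   // roughly 26.7 x 15 inches
console.log(printSizeInches(1920, 1080, 300));  // 6.4 x 3.6 inches
```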
-
Through my experiments, I have found the 2d video option is going to work the best. I ran some tests with some ink injected into water and it seems to be the fastest and best option for my application. Thanks for your help! I really appreciate it.
-
Well, it sounds like VRay may be the way to go. Thanks for your help! One last question: can rendering with VRay utilize the NET Render service, or would I have to have it installed on all the machines that I have NET Render on?
Thanks again!
-Brian
-
That was the issue. Man, do I feel like an ass. Oh well, at least it’s working. Thanks again for your patience and help!