I then merged all of the pieces that make up the structure and fractured them. I feel comfortable with the fracturing process, and the point of this file was to work out how I would pull off the gravity beam effect, so I created very simple Voronoi fractures for the building by using an IsoOffset to create a volume around the geometry, then scattering points inside the volume. I look forward to creating more complicated and physically accurate fractures for the final product.
I created the DOP network where my simulation would take place and brought in my fractured building as a packed RBD. My next step was to create constraints to act as the nails holding the building together. I used a "Connect Adjacent Pieces" node to create them, keeping the search radius quite low so that sections of wall would hold together as if they were nailed.
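Conceptually, Connect Adjacent Pieces links any two pieces whose points fall within the search radius. A minimal sketch of that idea, using piece centroids (the real node works on the pieces' points, and stores the constraints as prim attributes):

```python
import itertools
import math

def connect_adjacent_pieces(centroids, search_radius):
    """Create a constraint (edge) between every pair of piece
    centroids within search_radius of each other -- a rough sketch
    of what Connect Adjacent Pieces does."""
    constraints = []
    for (i, a), (j, b) in itertools.combinations(enumerate(centroids), 2):
        dist = math.dist(a, b)
        if dist <= search_radius:
            constraints.append((i, j, dist))  # keep the rest length too
    return constraints

# Pieces laid out along a wall, 1 unit apart; a small radius links
# only direct neighbours, mimicking "nails" between adjacent boards.
pieces = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (2.0, 0.0, 0.0)]
print(connect_adjacent_pieces(pieces, 1.5))
```

A larger radius would also connect non-neighbouring pieces, which is why keeping it low gives the nailed-together look.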
After looking at references and researching constraints, I decided to go with soft constraints. Below is a reference I was looking at, along with a comparison of soft constraints versus standard glue constraints. I liked the behavior of the soft constraints far more; it gave the effect of nails fighting to keep the structure together.
The stiffness value will come in handy when designing the behavior of the destruction; it effectively controls how strongly the constraints hold the pieces together.
Project Research and Development - Beam Effect
I began creating the beam effect by making a simple cone animated to pass over the building.
The idea is that all of the fractured pieces in the simulation start inactive; as the beam passes over the structure, pieces are added to a group and that group becomes active, essentially "switching on" the simulation for those pieces. I slightly altered user 3dsmaya's tutorial (found here: https://www.youtube.com/watch?v=_51xtpdQKU8). I first turned my fractured house from an active object into a static object so the simulation wouldn't run and the house would remain still. Then, inside a SOP Solver, I created a Group node and changed the bounding type to "Bounding Volume." I then brought in my beam object and used an IsoOffset to create a volume that the Group node could reference.
You can see in the gif that, as the beam passes over the structure, the green pieces are added to and removed from the active group.
Finally, using an attribute wrangle, I was able to activate only the pieces inside the group.
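The key detail is that activation is sticky: a piece the beam has touched stays active even after it leaves the group. A small sketch of that per-frame pass (the wrangle itself would just set `i@active = 1` on the grouped points):

```python
def update_active(active, in_beam):
    """Per-frame activation pass: a piece that enters the beam's
    bounding volume switches on and stays on, mirroring the
    SOP Solver + wrangle setup."""
    return [1 if (a or b) else 0 for a, b in zip(active, in_beam)]

# Three frames of the beam sweeping left to right over four pieces.
active = [0, 0, 0, 0]
for in_beam in ([1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0]):
    active = update_active(active, in_beam)
print(active)  # pieces the beam has already passed remain active
```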
The next part was to get the pieces to be attracted up towards the UFO. I created a sphere that would act as the attraction point for the simulation, and used a "pop attract" node to attract the pieces, once activated, towards the sphere.
This resulted in behavior a little more hectic than I expected: the pieces were accelerating too quickly and orbiting the sphere too much. I limited the speed of the pieces with a "speed limit" node. Changing the maximum speed to a value of 10 produced a much calmer behavior, closer to what I was looking for.
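A speed limit is just a clamp on the velocity's magnitude with the direction preserved. A minimal sketch of that operation:

```python
import math

def limit_speed(v, max_speed):
    """Clamp a velocity vector's magnitude, the way a speed-limit
    node does: direction is preserved, speed is capped."""
    speed = math.sqrt(sum(c * c for c in v))
    if speed <= max_speed or speed == 0.0:
        return v
    scale = max_speed / speed
    return tuple(c * scale for c in v)

# A piece rocketing toward the attractor gets reined in to speed 10.
print(limit_speed((30.0, 0.0, 0.0), 10.0))
# A slow piece passes through unchanged.
print(limit_speed((3.0, 4.0, 0.0), 10.0))
```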
Modeling - House
I based the house model on Sears Modern Home No. 264B110. The house was offered in a catalog from 1916, so it has that vintage old farmhouse feel I am aiming for. Luckily, the catalog page comes with a floor plan/layout, so I was able to lay out the walls very accurately.
The house is built in separate components based on material, so when they are fractured, each component fractures based on the material it is built out of.
This is where the house currently sits as of 9/12/2021. It still needs window and door trims, soffits, interior walls, and a set of stairs leading up to the porch.
House Update 1:
The house model is beginning to near completion. Here is a turntable to again illustrate the layers that go into the model.
Project Research and Development - Grass
If the UFO appears over a building, it is going to generate some sort of exhaust over the area, and the grass around the building will have to react to it.
I started by creating a box and using Houdini's HairGen to scatter hair across the box's surface.
Using Guide Process nodes, I was able to add some frizz and bend, and randomize the length of the blades, to give the grass a more natural look.
The process was fairly straightforward. I then converted the hair to Vellum so it could be simulated, and added pop fans to simulate the exhaust pushing down on the grass. I ended up using two fans, the second with a wider cone but a weaker force, to give the effect a sort of falloff. The falloff is still too harsh, and overall the simulation is pretty expensive, so I'm not sure it is the best way to go about the effect.
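The two-fan layering can be sketched as a simple angular falloff. This sketch assumes each fan applies full strength inside its cone and nothing outside (real fan forces roll off smoothly at the edge), and the cone angles and strengths here are hypothetical:

```python
def fan_force(angle, cone_angle, strength):
    """Force from a single fan: full strength inside its cone,
    zero outside (a simplification -- no edge rolloff)."""
    return strength if angle <= cone_angle else 0.0

def combined_force(angle):
    # Hypothetical values: a strong narrow fan layered with a
    # weak wide fan to soften the edge of the affected area.
    return fan_force(angle, 15.0, 8.0) + fan_force(angle, 45.0, 2.0)

# Sampling from directly under the UFO outward: strong core,
# weak fringe, nothing beyond the wide cone.
print([combined_force(a) for a in (5.0, 30.0, 60.0)])
```

The stepped result shows why the falloff still reads as harsh: two hard-edged cones only give two bands of force rather than a smooth gradient.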
Grass Update 1:
The Vellum simulation was too expensive for an entire field of grass. Going back to the reference images, a few of them had spottier areas of grass on a mostly dirt landscape. My partner on this project, Elyse, recommended we use those images as the reference for our terrain. This would cut down on the amount of grass we need to simulate, and the rest can be replaced with dusty pyro sims.
I started by creating a grid. To create the spotty texture, I used procedural noise to manipulate the grid's color attribute, to get areas of white and areas of black. When scattering points, I can tell Houdini to scatter based on this color attribute, so points are generated only in the white areas.
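Density-driven scattering can be sketched as rejection sampling: propose a random position, then keep it with probability equal to the local density. A minimal sketch on a unit grid, with a hypothetical hard-edged density function standing in for the noise:

```python
import random

def scatter_by_density(density_fn, count, seed=0):
    """Rejection-sample points on a unit grid so the local point
    density follows a 0..1 density function -- the same idea as
    scattering driven by a noisy color attribute."""
    rng = random.Random(seed)
    points = []
    while len(points) < count:
        x, y = rng.random(), rng.random()
        if rng.random() < density_fn(x, y):  # keep with probability = density
            points.append((x, y))
    return points

# Hypothetical density: grass only on the left half of the grid
# (a stand-in for the white areas of the noise pattern).
pts = scatter_by_density(lambda x, y: 1.0 if x < 0.5 else 0.0, 100)
print(all(x < 0.5 for x, y in pts))  # no points land in the "dirt"
```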
To combat high poly counts even further, I created a camera frustum: a box parented to the camera that acts as its viewing field. With it, I am able to delete any points outside the camera's view, greatly decreasing sim times and generally speeding up the workflow.
I then simply copied lines onto the points to create the grass. By bending each line slightly and adjusting point attributes such as pscale, normal, and orientation, you can get interesting, realistic-looking grass.
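The copy-and-bend step can be sketched as building one polyline per scatter point, with the blade's height driven by pscale, a random lean direction, and a quadratic bend toward the tip (the bend profile here is a hypothetical stand-in for whatever the actual setup uses):

```python
import math
import random

def grass_blade(base, pscale, segments=4, bend=0.3, seed=0):
    """Build one blade as a polyline copied to a scatter point:
    height comes from pscale, and the tip leans over by `bend`
    in a random direction."""
    rng = random.Random(seed)
    lean = rng.uniform(0.0, 2.0 * math.pi)  # random lean direction
    pts = []
    for i in range(segments + 1):
        t = i / segments
        offset = bend * t * t * pscale  # quadratic bend toward the tip
        pts.append((base[0] + math.cos(lean) * offset,
                    base[1] + t * pscale,
                    base[2] + math.sin(lean) * offset))
    return pts

blade = grass_blade((0.0, 0.0, 0.0), pscale=1.5)
print(len(blade), blade[-1][1])  # 5 points, tip at pscale height
```

Varying pscale, bend, and the seed per point is what breaks up the uniformity and makes the field read as natural.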
First draft of a storyboard.
I managed to work out the pipeline to get the film rendered. I aimed to use Arnold in Maya because I wanted to utilize render layers, which I could not do with Mantra on the render farm.
My biggest challenge was getting simulated packed geometry to maintain its UVs when transferred to Maya. I started by grouping the fractured pieces based on their UVs. For example, the framing for the house all uses the same UV layout, so I grouped those pieces together.
After the simulation was run and brought back into the network with a DOP Import, I unpacked the pieces. The sim uses packed geometry to run faster, but Maya won't recognize packed geometry, so I used an Unpack node, which took my 6,000 points and turned them back into a full mesh.
After the geometry was unpacked, the next step was to promote the UV attribute from the Vertex level, where it is stored in Houdini, to the Point level, where Maya can find it. I then deleted all of the other geometry, keeping only the original group, and exported it as an Alembic.
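Promoting a vertex attribute to points means collapsing each point's vertex values into one value, typically by averaging. A minimal sketch of that reduction (attribute promotion in Houdini also offers other merge modes such as first/max; averaging is assumed here):

```python
def promote_uv_vertex_to_point(vertex_point, vertex_uv):
    """Average each point's vertex UVs down to a single point UV,
    mirroring a Vertex -> Point attribute promote with the
    Average method."""
    sums, counts = {}, {}
    for pt, uv in zip(vertex_point, vertex_uv):
        u, v = sums.get(pt, (0.0, 0.0))
        sums[pt] = (u + uv[0], v + uv[1])
        counts[pt] = counts.get(pt, 0) + 1
    return {pt: (u / counts[pt], v / counts[pt]) for pt, (u, v) in sums.items()}

# Two vertices share point 0 (a UV seam); their UVs get averaged,
# which is why promotion can smear UVs right at seams.
uvs = promote_uv_vertex_to_point([0, 0, 1], [(0.0, 0.0), (0.2, 0.0), (1.0, 1.0)])
print(uvs)
```

This also shows the trade-off: points on UV seams have multiple vertex UVs, and averaging them can distort the mapping there, which is another reason grouping pieces by UV layout keeps things manageable.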
If the entire mesh were exported as one, all of the meshes' UVs would be combined into a single set, and the mesh would come into Maya as one combined object. That would make assigning materials a very big pain, so I opted to export each component of the house as a separate object. It's a little extra work once in Maya, but overall I think it is much cleaner.
Here is a result of the pipeline test. The simulation was done in Houdini, exported to Maya where materials were assigned, then rendered on the renderfarm.
Another issue when going from Houdini to Maya was dealing with any pyro coming out of Houdini. My options were to come up with a pipeline solution from Houdini to Maya that made the pyro renderable on the farm, or to match the Maya lighting in Houdini and render the pyro there. I tried matching the lighting in Houdini but was not pleased with the results. I then found that Maya's Arnold Volume lets you point to a VDB file, and can also use a sequence of VDB files.
That would work perfectly for pyro. I converted the pyro simulation to VDBs (after deleting unnecessary fields and cleaning the sim so it was as light as possible) and cached it out, using Houdini's padzero function so the filenames were named appropriately for Maya.
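The naming convention is just zero-padded frame numbers, the same result as Houdini's `padzero(4, $F)` expression. A sketch of the filenames the cache produces (the basename and frame range here are illustrative):

```python
def vdb_sequence(basename, start, end, pad=4):
    """Generate zero-padded VDB filenames, the way padzero(pad, $F)
    names them, so Maya's Arnold Volume can pick the files up
    as a sequence."""
    return [f"{basename}.{frame:0{pad}d}.vdb" for frame in range(start, end + 1)]

print(vdb_sequence("pyro", 1, 3))
# ['pyro.0001.vdb', 'pyro.0002.vdb', 'pyro.0003.vdb']
```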
After I uploaded to the farm, I noticed that the farm was looking for an absolute path to the VDB files and erroring out. For some reason, Maya lets you use relative paths for just about everything else (scenes, textures, Alembics, etc.) but not for Arnold Volume files.
This was a big letdown, but I also noticed that when the farm opened my Maya scene, it opened it from a folder called root/farmhome/. So I was curious whether I could force the farm to use my files by giving it a similar path.
By forcing the scene to look for the VDBs in the root/farmhome/ folder rather than on my machine, the farm was happy and everything rendered properly. An overly simple solution that took a day or two longer than needed to realize.
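The fix amounts to a prefix swap on the volume paths. A sketch of that remap, where both the local root and the file name are illustrative (only the root/farmhome/ folder comes from the actual farm setup):

```python
def remap_to_farm(path, local_root, farm_root="root/farmhome/"):
    """Swap a local project root for the farm's root so the render
    farm can resolve the VDB files. Paths other than farm_root are
    illustrative, not the actual project layout."""
    if path.startswith(local_root):
        return farm_root + path[len(local_root):]
    return path

print(remap_to_farm("C:/projects/pyro.0001.vdb", "C:/projects/"))
# root/farmhome/pyro.0001.vdb
```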
Updated Animatic 10 - 17
Updated Animatic 10 - 25
Updated Animatic 10 - 31
Updated Animatic 11 - 9