echinopsis
Apr 13, 2004



College Slice

looks amazing


ImplicitAssembler
Jan 24, 2013



heh. that's really neat.

sigma 6
Nov 27, 2004

the mirror would do well to reflect further

That's super cool. Can you give us a breakdown or quick tut on how it was done? Agisoft or reality capture?

500
Apr 7, 2019



sigma 6 posted:

That's super cool. Can you give us a breakdown or quick tut on how it was done? Agisoft or reality capture?

Thanks! I used Agisoft Metashape and Houdini. But you could really use any photogrammetry software that lets you export a point cloud.

I started by finding a nice alley in this video. I used the one around the 9 minute mark. Then I set the video to full screen and used the chrome inspector to remove the UI so I could step through frame by frame and take screenshots.

Next, I generated a dense point cloud from those screenshots in Metashape, exported the points to obj, and then imported the obj into Houdini. The disintegration effect is created by adding a noise attribute to the particles in a point VOP that will flag some for disintegration, and then moving the position of those particles slightly on the y axis every frame with a solver. Last steps are adding a camera and some lights and rendering with Redshift.
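The point-VOP-plus-solver idea is easy to sketch outside Houdini. Here's a NumPy stand-in for the per-point logic; the noise attribute, threshold, and per-frame drift are all invented values, and a real setup would use spatial noise rather than white noise:

```python
import numpy as np

rng = np.random.default_rng(0)
num_points = 1000
points = rng.uniform(-1.0, 1.0, size=(num_points, 3))  # stand-in for the imported point cloud

# "Point VOP" step: a per-point noise attribute flags some points for disintegration.
noise = rng.random(num_points)   # white noise here; a real setup would use spatial noise
flagged = noise > 0.7            # threshold is an arbitrary choice

# "Solver" step: every frame, nudge flagged points slightly along the y axis.
drift_per_frame = 0.02

def step(pts, flagged, frames=1):
    out = pts.copy()
    out[flagged, 1] += drift_per_frame * frames
    return out

after_24 = step(points, flagged, frames=24)  # e.g. one second at 24 fps
```

Unflagged points stay put while flagged ones drift upward, which is the whole trick.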

ceebee
Feb 12, 2004


500 posted:

Thanks! I used Agisoft Metashape and Houdini. But you could really use any photogrammetry software that lets you export a point cloud.

I started by finding a nice alley in this video. I used the one around the 9 minute mark. Then I set the video to full screen and used the chrome inspector to remove the UI so I could step through frame by frame and take screenshots.

Next, I generated a dense point cloud from those screenshots in Metashape, exported the points to obj, and then imported the obj into Houdini. The disintegration effect is created by adding a noise attribute to the particles in a point VOP that will flag some for disintegration, and then moving the position of those particles slightly on the y axis every frame with a solver. Last steps are adding a camera and some lights and rendering with Redshift.

Awesome breakdown, thanks for the explanation, it came out looking sweet!

cubicle gangster
Jun 26, 2005

magda, make the tea


yeah, thank you! i'm going to try a tyflow version.

Handiklap
Aug 14, 2004

Mmmm no.


Woah that is sick as gently caress. We've done a few quick and dirty gopro-on-a-stick facade captures for, like, cheap reference, but this is just...sublimely novel.

500
Apr 7, 2019



Handiklap posted:

Woah that is sick as gently caress. We've done a few quick and dirty gopro-on-a-stick facade captures for, like, cheap reference, but this is just...sublimely novel.

Thank you! I first started thinking about it a couple of months ago when I saw this video, by Benjamin Bardou: https://vimeo.com/392767896

I wanted to make something similar, but didn't really feel comfortable photographing people in public. I actually spent way too long trying to find sequential photograph series on flickr and stuff. Took me a while to realise I could just use a youtube video of someone walking around.

Listerine
Jan 5, 2005

Exquisite Corpse

Did you manually save out each frame or automate it?

500
Apr 7, 2019



Manual. Windows key + print screen with the video set to full screen, using the > key to step through frames. Only took a few minutes.

BonoMan
Feb 20, 2002


Jade Ear Joe

I think I'm finally going to switch over to Cinema4D from Maya. I've used Maya since 1.0 in the 90s but never really used it enough to feel it in my bones. As my work has taken me more into motion graphics and compositing, I feel like C4D might be the better option as a supplemental program for a solo user.

Maya is super powerful but for a one man show that uses it to create content to supplement my After Effects work, it can be really burdensome and obtuse sometimes.

I've been watching a lot of the stuff from C4D Live over the past few days and goddamn so much stuff in the C4D interface just makes sense and I like it a lot.

What's the consensus on Redshift? The subscription models have a with-or-without option, but from what I'm reading it might be a while before Redshift is the optimal renderer for C4D. But, ya know, the internet and all.

Anyway just puttin' out some feelers.

EoinCannon
Aug 29, 2008



Grimey Drawer

BonoMan posted:

I think I'm finally going to switch over to Cinema4D from Maya. I've used Maya since 1.0 in the 90s but never really used it enough to feel it in my bones. As my work has taken me more into motion graphics and compositing, I feel like C4D might be the better option as a supplemental program for a solo user.

Maya is super powerful but for a one man show that uses it to create content to supplement my After Effects work, it can be really burdensome and obtuse sometimes.

I've been watching a lot of the stuff from C4D Live over the past few days and goddamn so much stuff in the C4D interface just makes sense and I like it a lot.

What's the consensus on Redshift? The subscription models have a with-or-without option, but from what I'm reading it might be a while before Redshift is the optimal renderer for C4D. But, ya know, the internet and all.

Anyway just puttin' out some feelers.

Redshift is really good, very fast
We run max but lots of the freelancers we work with that do motion graphics-y stuff use C4D/redshift

Listerine
Jan 5, 2005

Exquisite Corpse

When the guys at Entagma ran a poll to see which renderer their viewers wanted more tutorials to cover, Redshift was the top choice by a landslide. It's actively supported and updated fairly frequently, and also cheaper than Octane by a little.

Fragrag
Aug 3, 2007
The Worst Admin Ever bashes You in the head with his banhammer. It is smashed into the body, an unrecognizable mass! You have been struck down.

Listerine posted:

Did you manually save out each frame or automate it?

500 posted:

Manual. Windows key + print screen with the video set to full screen, using the > key to step through frames. Only took a few minutes.

Because I'm a sucker for scripting, I found a quick one-liner command to automate that if you have youtube-dl and ffmpeg installed on your system:

code:
youtube-dl "[YoutubeURL]" -o - | ffmpeg -i pipe: -filter:v fps=25 output_%03d.jpeg

bring back old gbs
Feb 28, 2007

I'm a very strong saiyan. Very strong. Probably the strongest ever.


BonoMan posted:

I think I'm finally going to switch over to Cinema4D from Maya. I've used Maya since 1.0 in the 90s but never really used it enough to feel it in my bones. As my work has taken me more into motion graphics and compositing, I feel like C4D might be the better option as a supplemental program for a solo user.

Maya is super powerful but for a one man show that uses it to create content to supplement my After Effects work, it can be really burdensome and obtuse sometimes.

I've been watching a lot of the stuff from C4D Live over the past few days and goddamn so much stuff in the C4D interface just makes sense and I like it a lot.

What's the consensus on Redshift? The subscription models have a with-or-without option, but from what I'm reading it might be a while before Redshift is the optimal renderer for C4D. But, ya know, the internet and all.

Anyway just puttin' out some feelers.

You won't regret it, except sometimes. You'll be creating things that just "look" good in a fraction of the time but in 2020 why not give Blender a shot? Like... c4d is the language I speak, I even use it for modelling poo poo for 3d printing where a dedicated CAD program would benefit me greatly because I just KNOW cinema, and can push it around way easier. But if I were learning something new from scratch today... and looking at all the crazy poo poo blender has as its default package, not... whatever studio/master/ultimate/production bundles maxon is cutting their features up into today.

Obviously if you're pirating it this is all out the window, just get the ultimate version with hair and dynamics and real time GPU rendering, scripting, etc. But if you're going to be paying for it, I almost feel like its prime time has passed.


EDIT: waitwait wait i'm on the Maxon site and they don't seem to cut c4d up into crippled versions anymore. that's cool, I guess it would have been a pain in the rear end to combine that system AND a subscription model

bring back old gbs fucked around with this message at 10:52 on Apr 24, 2020

ceebee
Feb 12, 2004


Yeah, in my opinion Maya is lacking user-friendliness in its motion graphics tools. There's MASH, which is a pretty powerful system included for procedural/instanced type effects, but I've seen C4D people on youtube make motion graphics stuff in 1/4th the time it would take me to do it in Maya, and I consider myself fairly proficient. I still prefer Maya for all my modelling, UVing (although RizomUV is just about beating this), and game engine stuff.

But yeah Blender is looking hella juicy for pretty much anything these days. I'll probably end up learning it in the next 1-2 years, maybe less.

SubNat
Nov 27, 2008

I wish I was more Moomin-minded...


500 posted:

Manual. Windows key + print screen with the video set to full screen, using the > key to step through frames. Only took a few minutes.

Fragrag posted:

Because I'm a sucker for scripting, I found a quick one-liner command to automate that if you have youtube-dl and ffmpeg installed on your system:

code:
youtube-dl "[YoutubeURL]" -o - | ffmpeg -i pipe: -filter:v fps=25 output_%03d.jpeg

If you download the video, I've used PotPlayer to rip frames, since you can have it just save out every n frames to a directory with no hassle.

500
Apr 7, 2019



Fragrag posted:

Because I'm a sucker for scripting, I found a quick one-liner command to automate that if you have youtube-dl and ffmpeg installed on your system:

code:
youtube-dl "[YoutubeURL]" -o - | ffmpeg -i pipe: -filter:v fps=25 output_%03d.jpeg

Sick! I haven't heard of youtube-dl before. Could the script be edited like this to only record a certain time range? (I haven't done much command line scripting.) For example, if we want to start 60 seconds in and go for the next 30 seconds:

code:
youtube-dl "[YoutubeURL]" -o - | ffmpeg -ss 60 -t 30 -i pipe: -filter:v fps=25 output_%03d.jpeg

SubNat posted:

If you download the video, I've used PotPlayer to rip frames, since you can have it just save out every n frames to a directory with no hassle.

I also haven't heard of PotPlayer, so thanks for the tip. I've been trying to avoid large downloads at the moment though, since my internet situation is lovely and I have to work from home. Has anyone seen what they charge for a portable 4g modem in the US? It's bloody outrageous.

Prolonged Priapism
Dec 21, 2007
Holy hookrat Sally smoking crack in the alley!







Proof of concept for a potential project.

cubicle gangster
Jun 26, 2005

magda, make the tea


Pretty sure the Lego star wars movie is under NDA!

Good bumps and scuffs. The Lego movies had a layer of greasy fingerprints too - and I've just realised the one situation where the vray triplanar map 'randomise every frame' checkbox would actually be useful.

cubicle gangster fucked around with this message at 05:48 on Apr 27, 2020

Odddzy
Oct 10, 2007
Once shot a man in Reno.

cubicle gangster posted:

and I've just realised the one situation where the vray triplanar map 'randomise every frame' checkbox would actually be useful.

What use would that be?

BonoMan
Feb 20, 2002


Jade Ear Joe

Odddzy posted:

What use would that be?

Making the smudges move every frame like it was actually being hand animated?

Odddzy
Oct 10, 2007
Once shot a man in Reno.

BonoMan posted:

Making the smudges move every frame like it was actually being hand animated?

aaaah yeah, that would make sense!

500
Apr 7, 2019



Playing around with some volumes in Houdini. Still figuring out how to dial in the samples and stuff in Redshift. It ended up coming out a bit grainy, but oh well.

https://i.imgur.com/AsUKSqG.gifv

Odddzy
Oct 10, 2007
Once shot a man in Reno.

drat, that looks good man.

Some Goon
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.



OP's literally over a decade old, so sorry if this isn't the right thread, it looks like the most appropriate one.

Coronaboredom has me pursuing my interest in the artistic possibilities of VR. I don't want to make games or anything like that, just create non-interactive environments for people to inhabit / experience, but there are lots and lots of programs out there for creating VR content and I'm having a hard time picking one to go with. The in-headset tools I've used, while cool, don't offer the level of precision I'm after. What do y'all recommend for this purpose? I'm a complete neophyte at digital art but I've got all the time in the world, so a steep learning curve isn't a turnoff if that's what it takes.

Also, to be clear, I have no interest in this professionally, something about VR just tickles an artistic bone in me that has long been dormant.

Comfy Fleece Sweater
Apr 2, 2013


Some Goon posted:

OP's literally over a decade old, so sorry if this isn't the right thread, it looks like the most appropriate one.

Coronaboredom has me pursuing my interest in the artistic possibilities of VR. I don't want to make games or anything like that, just create non-interactive environments for people to inhabit / experience, but there are lots and lots of programs out there for creating VR content and I'm having a hard time picking one to go with. The in-headset tools I've used, while cool, don't offer the level of precision I'm after. What do y'all recommend for this purpose? I'm a complete neophyte at digital art but I've got all the time in the world, so a steep learning curve isn't a turnoff if that's what it takes.

Also, to be clear, I have no interest in this professionally, something about VR just tickles an artistic bone in me that has long been dormant.

I've no experience with this (no VR headset) but I follow a guy on twitter who makes cool stuff with: https://quill.fb.com/

And apparently it's free.

Edit: also I know Modo has some tools for VR, but that's a heavier app; there's an indie version.

Warbird
May 23, 2012

Burn the 'dawgs
Kill the Yellowjackets
Purge the Tiger
It is better to die for Bama than to live for yourself


Fun Shoe

If stuff like Tiltbrush isn't doing it for you, take a look at Blender. IIRC the next major release is getting built-in VR support, which would let you build and mess with things both in VR and at your desk proper.

E - here we go: https://www.roadtovr.com/blender-vr-support-openxr/amp/ So late May, and it's just a viewer for now.

Double e - There's also an existing plugin to let you do that, so you could get cracking now if you'd want: https://www.marui-plugin.com/blender-xr/

Warbird fucked around with this message at 03:26 on May 6, 2020

SubNat
Nov 27, 2008

I wish I was more Moomin-minded...


Why not Unreal? You wouldn't have to look at programming, and it's quite easy to just get started with vr in scenes.
There's the ability to work in the editor in VR if that tickles your fancy, and you can scale your ambitions as high or low as you'd like, though custom models would be best to make in a 3d program and drop in.

There's a veritable fuckload of free assets for use in it, between Quixel and the free Epic and Epic-subsidised assets as well.

mutata
Mar 1, 2003

You walk in with the Turnips, you leave with the Bells.



A 3d package like Blender feeding assets into a VR-ready middleware engine like Unreal or Unity is what you want.

Source: I do exactly what you described as a job.

Some Goon
Jan 6, 2013

A golden helix streaked skyward from the Helvault. A thunderous explosion shattered the silver monolith and Avacyn emerged, free from her prison at last.



Thanks y'all.

The March Hare
Oct 15, 2006

Je rêve d'un
Wayne's World 3


Buglord

Similarly to Some Goon I find myself with a bunch of free time.

I've got some old stained glass windows and I was thinking it might be fun to take a stab at recreating them in 3D so I can try to get a feel for what they might look like in different window depths and stuff without shattering them IRL.

I feel like I should be able to work out the modeling, I've got some experience with it from my days studying architecture forever ago, but I never really got into materials or rendering as much.



I assume this is possible, tell me if it isn't, but ideally I'd like to cut out and port over each piece of glass from a reference photo like this one and then I have some kind of map on top of that I guess to act as a light filter if that makes sense. Basically to simulate the effect of light passing through the different paints and bits of grime.

I don't need anyone to do a full writeup or anything, but if anyone can offer a quick summary of what terms I should Google search for this that'd be super helpful.

EoinCannon
Aug 29, 2008



Grimey Drawer

The March Hare posted:

Similarly to Some Goon I find myself with a bunch of free time.

I've got some old stained glass windows and I was thinking it might be fun to take a stab at recreating them in 3D so I can try to get a feel for what they might look like in different window depths and stuff without shattering them IRL.

I feel like I should be able to work out the modeling, I've got some experience with it from my days studying architecture forever ago, but I never really got into materials or rendering as much.



I assume this is possible, tell me if it isn't, but ideally I'd like to cut out and port over each piece of glass from a reference photo like this one and then I have some kind of map on top of that I guess to act as a light filter if that makes sense. Basically to simulate the effect of light passing through the different paints and bits of grime.

I don't need anyone to do a full writeup or anything, but if anyone can offer a quick summary of what terms I should Google search for this that'd be super helpful.

Depending on the refractive shader you're using, you should be able to pretty much plug the colour map into some sort of absorption or extinction value to get the glass refracting the desired colour. Then use an overall grunge map in some kind of material blend to bring a black dirt material into the desired areas.
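The absorption idea above is essentially Beer-Lambert attenuation, which is why plugging a colour map into an absorption slot tints the refraction. A toy per-channel version (the coefficients are invented, not taken from any renderer) shows why thicker glass reads darker and more saturated:

```python
import math

def transmitted(absorption_rgb, thickness_cm):
    """Beer-Lambert: fraction of light surviving a path through tinted glass."""
    return tuple(math.exp(-a * thickness_cm) for a in absorption_rgb)

# Invented coefficients: a reddish glass absorbs green and blue more strongly than red.
red_glass = (0.1, 1.2, 1.5)
thin = transmitted(red_glass, 0.3)
thick = transmitted(red_glass, 1.0)
# Thicker glass transmits less light overall and skews further toward red.
```

Grime and paint layers then just scale this transmission down further per pixel, which is what the grunge-map blend is doing.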

cYn
Apr 1, 2008


Finally getting around to Zbrush. Once you get past the weird broken poo poo left over from when it was designed for something else, it's pretty cool. My first attempt in progress.

Is it worth doing my textures in Zbrush? Or should I go to 3dCoat/Substance.

Listerine
Jan 5, 2005

Exquisite Corpse

cYn posted:

Finally getting around to Zbrush. Once you get past the weird broken poo poo left over from when it was designed for something else, it's pretty cool. My first attempt in progress.

Is it worth doing my textures in Zbrush? Or should I go to 3dCoat/Substance.



I'm not a fan of texturing in Zbrush; it relies heavily on Polypaint, which is frustrating, and the obtuse interface that Zbrush is known for gets really clunky when you're trying to figure out how to work between polypaint color, the texture palette, and exporting your work. I much prefer Substance.

500
Apr 7, 2019



First look at Unreal Engine 5
https://www.unrealengine.com/en-US/...unreal-engine-5

I'm confused by this part:

quote:

Nanite virtualized micropolygon geometry frees artists to create as much geometric detail as the eye can see. Nanite virtualized geometry means that film-quality source art comprising hundreds of millions or billions of polygons can be imported directly into Unreal Engine (anything from ZBrush sculpts to photogrammetry scans to CAD data) and it just works. Nanite geometry is streamed and scaled in real time so there are no more polygon count budgets, polygon memory budgets, or draw count budgets; there is no need to bake details to normal maps or manually author LODs; and there is no loss in quality.

Wouldn't a system like this result in enormous downloads?

Comfy Fleece Sweater
Apr 2, 2013


500 posted:

First look at Unreal Engine 5
https://www.unrealengine.com/en-US/...unreal-engine-5

I'm confused by this part:


Wouldn't a system like this result in enormous downloads?

This sounds insane and it's what got my attention

I'm just learning about normals and all that crap, maybe I'll skip it and just throw tons of polys at the problem

THERE'S NO POLYGON BUDGETS ANYMORE or so they say

SubNat
Nov 27, 2008

I wish I was more Moomin-minded...


500 posted:

First look at Unreal Engine 5
https://www.unrealengine.com/en-US/...unreal-engine-5

I'm confused by this part:


Wouldn't a system like this result in enormous downloads?

I wouldn't at all be surprised if it comes with a very efficient way of storing the models as well.
I imagine they would be a lot bigger than current assets on their own, but at the same time, if they were to make normal maps for assets of that quality level you'd probably be drowning in 4k and 8k normal maps (streamed in through virtual texturing, of course.) so that in itself would have some space savings.
(Same goes for stripping away LODs, that too would clean out some space usage, and memory overhead.)

One 8K normal map is like 60MB, after all. Meanwhile, a 12.5 million polygon mesh saved in an FBX file is like ~25MB.
Never mind that the final mesh would likely also have multiple LOD steps, and possibly be partially duplicated due to having lower LODs be proxy-meshed together in their HLOD system as well.

The same goes for Lumen (aka what I assume is hybrid/cached RT GI), which could also save a lot of 'download size', in that a lot of spaces that would normally be lightmapped to get good GI no longer need to be.
With the bonus that it's suddenly supporting dynamic scenes.

It seems a lot like they're utilizing the fact that they can finally work with platforms where you're expected and required to have mesh shader capability, and can thus leverage it into insanely efficient delivery of high poly assets.
Which means that a lot of workarounds for drawcall limitations, as well as size and poly budgets, can now be waived.

Honestly I just hope this means they can finally kill off prerendered cutscenes for good, which would probably be a huge net benefit to game size.

500
Apr 7, 2019



SubNat posted:

Meanwhile a 12.5 million polygon mesh saved in a FBX file is like ~25MB.

How did you arrive at this number? I just saved out a mesh with ~12 million tris and it came out to about 450MB. I feel like I'm missing something.


ceebee
Feb 12, 2004


FBX size is also dependent on stuff like vertex color, UVs, smoothing groups, skinned vertices, stored maps, etc. I definitely have huge FBX files that come out of ZBrush that contain a lot of vertex color info and stuff (12 mil is definitely a 400-700MB file for me). They definitely must be developing some efficient method for storing and reading/writing that data. Curious to see what it is.
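Some napkin math makes the disagreement over file sizes concrete. Assuming uncompressed 32-bit floats and indexed triangles with nothing but positions (no UVs, normals, or vertex color - all simplifying assumptions, not how FBX actually lays out data):

```python
# Back-of-envelope for a 12.5M-triangle mesh, stored naively.
tris = 12_500_000
verts = tris // 2                 # rule of thumb: a closed tri mesh has roughly half as many verts as tris

position_bytes = verts * 3 * 4    # xyz as 32-bit floats
index_bytes = tris * 3 * 4        # three 32-bit vertex indices per triangle
raw_mb = (position_bytes + index_bytes) / 1e6   # ~225 MB before any container overhead
```

On that estimate, ~25MB for 12.5M polys would need something like 10x compression (quantized positions, delta-coded indices, or similar), while ~450MB is plausible once vertex colors, normals, and format overhead pile on top of the raw geometry.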

I really hate hate hate that FBX is handled differently in almost all applications though. I hope Epic tries to standardize it with an open source format. FBX is hot garbage, but at least it's better than OBJ.
