I have UE 4.27.2 with the LightAct 4.0 plugin enabled. I downloaded the sample project, but the problem is that I cannot see the LightAct icon in the editor. I tried both the Virtual Production Example and the Unreal project provided by LightAct…
The LightAct plugin is enabled, but I still cannot see the icon. In the video tutorial it is clearly visible.
Thanks Mitja,
So from now on I have to manually import the objects I want to project onto and place them in the correct locations? Previously this was semi-automated, right?
One more thing: I export the meshes I want to project onto from Unreal Engine so they get the correct positions. There is one thing, though, that I think could be a problem. We have 3 separate meshes (screens that we will be projecting onto), but they have a specific UV mapping (in Unreal Engine we have a SceneCapture that produces a TextureTarget, which is then mapped to the screens).
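Just for context, the UE side of our setup is roughly this (a simplified sketch, the function and parameter names are placeholders rather than our actual assets):

```cpp
// Simplified sketch (names are placeholders, not our actual project).
// A SceneCapture2D renders into a render target, and that render target
// is applied to each screen mesh through a dynamic material instance
// whose base material exposes a texture parameter called "ContentTex".

#include "CoreMinimal.h"
#include "Components/SceneCaptureComponent2D.h"
#include "Components/StaticMeshComponent.h"
#include "Engine/TextureRenderTarget2D.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "Materials/MaterialInterface.h"

static void SetupScreenCapture(UObject* Outer,
                               USceneCaptureComponent2D* Capture,
                               UMaterialInterface* ScreenBaseMaterial,
                               const TArray<UStaticMeshComponent*>& ScreenMeshes)
{
    // Render target the SceneCapture writes into every frame.
    UTextureRenderTarget2D* CaptureRT = NewObject<UTextureRenderTarget2D>(Outer);
    CaptureRT->InitAutoFormat(4096, 1024);
    Capture->TextureTarget = CaptureRT;

    // Each screen samples this shared texture through its own UV region,
    // so the meshes carry a UV layout that maps them into the shared image.
    for (UStaticMeshComponent* Screen : ScreenMeshes)
    {
        UMaterialInstanceDynamic* MID =
            UMaterialInstanceDynamic::Create(ScreenBaseMaterial, Outer);
        MID->SetTextureParameterValue(TEXT("ContentTex"), CaptureRT);
        Screen->SetMaterial(0, MID);
    }
}
```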
When I export these meshes from Unreal, I assume they keep that UV mapping. Could that affect the way the Thrower works? Should these meshes have a different UV mapping for LightAct, or does it not matter?
I can see the thrower working correctly for the sphere, but I cannot get the projection to appear on the flat panels.
The main questions that determine which UE workflow you’ll want to use are these:
Are the projection objects present in UE?
If yes, then is the content applied to these objects in UE as well?
If the answer to both questions is yes, then you’ll probably want to use the Projector2UCam workflow. This is what we used in Tampa Water Street, for example.
In this workflow, you don’t use any throwers and the content doesn’t have to be applied to the objects in LightAct. The objects in LightAct are only for the calibration of projectors.
I have exported the geometries from Unreal and imported them into LightAct, so yes.
These objects are textured inside Unreal, and I would like to project that onto the objects in LightAct, so I guess yes. I started trying to do this with a Thrower, but so far I have not made much progress in getting anything out of Unreal. I guess I am doing something wrong… I am probably missing some basics.
I will let you know if I manage to get it running…
In the Projector2UCam workflow you won’t actually see the content on the projection objects in LightAct’s viewport. What this workflow does is create nDisplay cameras that are a perfect match for the projectors in LightAct (hence the name Projector to Unreal Cam). These nDisplay cameras then get shared to LightAct, which more or less directly outputs them to the projectors.
That being said, one workflow I would try is this: instead of applying the texture from Unreal to the projection objects through SceneCapture, why don’t you share it with LightAct through TextureShare? Then you could render it to a canvas and from there to the objects.
If 3D objects in LightAct get their content directly from a canvas, the content mapping takes the object’s UV map into account.
Thrower2UCam
As a side note, for the Thrower2UCam workflow it doesn’t really matter exactly how the UV mapping is done, as long as no two faces on the objects share the same UV coordinates. The content gets mapped to the 3D objects as if the thrower were projecting the content onto them.
Hey,
So with Projector2UCam we cannot do the projection study? I think at this point we need that until we move to the actual test venue.
The SceneCapture is basically there so we can map the content to the objects and preview the show in VR. Theoretically, we could TextureShare that SceneCapture output texture and use the “offset” UV maps on these 3 panels inside LightAct (they basically share the same texture, just each one uses a different segment of the image).
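On the UE side, the way each panel currently picks its segment is roughly this (a rough sketch, the parameter names are made up); the offset UV maps in LightAct would just replicate the same split:

```cpp
// Sketch of the "each panel uses a different segment of the shared texture"
// idea (parameter names are made up). The base material would sample the
// shared texture as: TexCoord * UVRegion.xy (scale) + UVRegion.zw (offset).

#include "CoreMinimal.h"
#include "Components/StaticMeshComponent.h"
#include "Materials/MaterialInstanceDynamic.h"
#include "Materials/MaterialInterface.h"

static void AssignPanelRegions(UObject* Outer,
                               UMaterialInterface* PanelBaseMaterial,
                               const TArray<UStaticMeshComponent*>& Panels)
{
    const int32 NumPanels = Panels.Num(); // 3 in our case

    for (int32 i = 0; i < NumPanels; ++i)
    {
        UMaterialInstanceDynamic* MID =
            UMaterialInstanceDynamic::Create(PanelBaseMaterial, Outer);

        // Panel i uses horizontal strip i of the shared texture:
        // scale U by 1/NumPanels and shift it by i/NumPanels.
        const float Scale  = 1.0f / NumPanels;
        const float Offset = static_cast<float>(i) / NumPanels;
        MID->SetVectorParameterValue(
            TEXT("UVRegion"), FLinearColor(Scale, 1.0f, Offset, 0.0f));

        Panels[i]->SetMaterial(0, MID);
    }
}
```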
So to sum up, if I would like to previs the show we would have to use a Thrower, but for the actual show you would recommend Projector2UCam?
But from what I can see, the Thrower does not have the Setup (calibrate projection) option, so I would still need to add a projector. Is the thrower just for previs, or for mapping the content onto a model in LightAct?
There is no reason why you wouldn’t be able to use the projection study in any of these workflows. The only requirement for the projection study is to have some projectors and some objects in your viewport, with those objects set as sources of those projectors.
The thrower is just one way to map content onto objects or video screens. The others being:
using a canvas (in which case UV mapping is very important),
using Set Texture nodes (in which case UV mapping is also very important).
Throwers don’t have a counterpart in the physical world (like projectors do, for example). If the way a thrower maps the content onto your objects is what you want, then you can/should use it. If not, then it’s not the right approach. I honestly can’t say which it is in your case. Either way, you still need to use projectors, which would be entirely independent of your thrower.
If you use SceneCapture in UE to map content onto objects in UE, and this approach is giving you the right content mapping, then I would suggest sharing the SceneCapture’s texture using TextureShare and mapping it to your 3D objects through a canvas.
However, if you map the texture coming from your SceneCapture onto your objects in UE in such a way as to simulate a projection onto those objects (as if that texture was projected onto them), then a thrower might actually be a good approach. A thrower acts as a ‘virtual projector’ that projects the content onto your objects inside LightAct only. Projectors in LightAct then ‘pull’ the content from the objects, acting like virtual cameras.