UnrealLink - Cannot get blend masks to work

I have a simple project setup to test image warping (2D projection mapping). It streams the output from UE4 using Spout and projects the warped image onto a single screen.

It works perfectly in LightAct and behaves as expected. However, the nDisplay config and .pfm files created by UnrealLink do not produce the same results when I use them with a standard nDisplay UE4 project. The screen resolution, position, etc. are fine, but the warping does not happen: it just shows the unwarped output.

I suspect this may well be a problem with nDisplay rather than LightAct, but I would appreciate any assistance, as Unreal’s documentation is typically lacking in detail and I can’t find any other information online. I am using the nDisplay Launcher from UE 4.25.

Here are the files that UnrealLink produces: https://we.tl/t-hHIuWcDvRM
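If it helps, the warp-related lines in a 4.25-style nDisplay config take roughly this shape. This is a sketch from memory of the 4.25 docs rather than my actual file, and the file names are placeholders, so please check the keys against the generated config:

```
[viewport] id=vp_1 x=0 y=0 width=1920 height=1080 projection=proj_1
[projection] id=proj_1 type="mpcdi" pfm="warp.pfm" alpha="alpha.png"
```

My understanding is that the viewport points at a projection policy, and the mpcdi policy is what consumes the .pfm warp mesh and the alpha blend mask.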



I am afraid you are describing mutually exclusive approaches: any kind of manual warping, be it with a Texture Warp and Blend node or inside a Canvas Setup window, does not get transferred to the nDisplay config file.

However, if you are using a 3D approach to projection mapping, where a projector inside LightAct’s visualizer acts as a camera looking at textured 3D objects, then the projector properties can be transferred to nDisplay config files.

This 3D projection mapping approach is further explained in these 2 videos:

Let me know if you have any further questions.

p.s. please note, there will be a major upgrade to this workflow in a few weeks or so.


Thanks Mitja. Okay, that explains why it’s not working.

I will need to use the 3D approach eventually, but was hoping to start with a simpler setup. I will dive into the tutorials now!

Thanks again for your amazing support!

Hi Mitja. I’m afraid I’m still not seeing the warping working.

I tried several tests with my own objects, but then went back to the sample project (UEProjectionMappingStarterPack) and found that this also doesn’t seem to do what I expect.

When I run the launcher and select the packaged executable and the domeProjectionMapping.cfg file, the program runs and displays the two projector outputs, but the alpha masks do not work: when the light sweeps round, I can see it being projected on the floor. There is also no sign that the output is being warped.

Am I missing something?!


Hi Adam,

Could you please send me a zipped folder with your LightAct project, so I can see what you are actually working on.

Also, when you create your .cfg file, are there some .png images in the same folder?

And could you please post a screenshot of your UnrealLink window just before you hit the Create nDisplay config file button?

Many thanks!

Hi Mitja,

Here is a link to two projects, along with supporting files: https://we.tl/t-AOKvmSNcQG

The Dome project is one of your samples from the UEProjectionMappingStarterPack. I have included the full LA project, including the assets and nDisplay folder. Also included are screenshots of the UnrealLink window and the nDisplay Launcher window. You can see I am using the packaged version of the matching UE4 project.

The output shows what I am seeing on screen when I run the nDisplay launcher. If the alpha masking and image warping were working, I would expect this to look different. Surely there should be no visible projection of light around the dome itself.

I have also included the Mannequin project, which is my first attempt at doing this with the asset we will be using for our client.

For this project, I am just using a standard UE nDisplay test project (based on their template) as the application in the Launcher. But in theory this shouldn’t make a difference – we could use any UE project that outputs using a DisplayClusterRootComponent or DisplayClusterActor, right?

With this project, I noticed that the alpha file produced is just a full-screen version of the texture that is being fed to the model. The dome project does at least produce alpha images, even though they don’t seem to actually work with nDisplay.

The only difference is that I am using a test image to feed to the Texture Setter in this project. Will an alpha mask only be produced if we use an image as an input to the texture setter? Or is this because the mannequin model lacks the relevant UV info?

I really appreciate your help with this. LightAct looks like a great solution - it’s quite straightforward to use, and seems very powerful. However, we need the image warping and masking/blending to work via nDisplay. I hope I am missing something simple!



Hi Adam,

I looked at your domeProjection files and it seems you’ve discovered a regression bug. :slight_smile:

In previous versions of LightAct (before 3.6.3), AutoBlend masks were created with a black background. In 3.6.3, however, they seem to be exported with a transparent background, which doesn’t work as intended in Unreal (it doesn’t really matter in LightAct though).

This image shows how the masks should look in order to be properly interpreted by Unreal.


Luckily, the workaround is very simple, because all we need to do is add a black background behind the white transparency mask coming out of the AutoBlend node in the projector’s layout:

Please open the projector’s layout and insert a Color to Texture node and a Texture Add node. Connect them as shown above and enter your projector’s resolution into the Color to Texture node.

Then open the UnrealLink window and regenerate the nDisplay config files. Now both masks should have a black background.
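If you ever need to fix already-exported masks outside LightAct, the effect of adding a black background is just standard alpha compositing over black. A minimal per-pixel sketch (the function name is illustrative, not part of any LightAct or Unreal API):

```python
def over_black(rgba):
    """Composite one RGBA pixel over an opaque black background.

    White areas stay white, transparent areas become black, which is
    how Unreal expects the blend mask to look.
    """
    r, g, b, a = rgba
    alpha = a / 255.0
    # "A over B" with B = opaque black reduces to scaling by alpha.
    return (round(r * alpha), round(g * alpha), round(b * alpha))

# Fully opaque white stays white; fully transparent becomes black.
print(over_black((255, 255, 255, 255)))  # (255, 255, 255)
print(over_black((255, 255, 255, 0)))    # (0, 0, 0)
```

Running this over every pixel of the transparent mask and saving the result without an alpha channel gives the same black-backed mask as the node workaround.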


As mentioned in my previous replies, no manual warping gets transferred to nDisplay config files in this workflow, so I am not quite sure what you are referring to. The end result of the current 3.6.3 iteration of UnrealLink is that LightAct’s output perfectly matches Unreal Engine’s output, as shown below:

Thanks Mitja. This explains why the masks weren’t working.

I have now been able to get the masking to work correctly in the dome projection sample project. I have also updated my mannequin project to create a blend mask in the same way, and the masking now works in this too.

However, are you saying that LightAct doesn’t produce warping info from the model and/or this doesn’t get carried through to nDisplay?

If so, how would I go about doing this? I need the image to be warped so it will project onto the mannequin surface.




I am pretty sure there is some misunderstanding here, but I don’t know where :slight_smile:

Let me start with a very basic question: are you trying to project a single Unreal Engine viewport onto your mannequin, or do you have a 3D model of your mannequin which, inside Unreal Engine, looks exactly like what you want your real mannequin to look like?

There is a key difference here: the difference between the 3D projection mapping workflow (the latter) and the 2D projection mapping workflow (the former).

If anything is unclear, please post some screenshots of your Unreal Engine project and what your mannequin looks like in there.


We need to project a single Unreal Engine viewport onto a physical mannequin.

I had assumed that we would output a flat image from Unreal, and that this would be warped according to the PFM file in the nDisplay config so that the resulting image texture would wrap correctly around the physical model.

LightAct seems to generate the PFM file and alpha mask, and the mask works as expected, but the PFM file is not actually warping the output, which is what I expected it to do.

But we do have a 3D model of the mannequin too, which we created from a Lidar scan. I created a PFM file manually from this 3D model (using the Warp Utils plugin in Unreal), and if I use this file it does produce the warping effect. However, Unreal’s documentation is pretty basic when it comes to how PFM files work, so it is hard to get the result we want doing it this way.
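For anyone else digging into this: the PFM container itself is simple, it is just a small text header followed by raw 32-bit floats, where nDisplay interprets each pixel’s three channels as a 3D position on the warp mesh. A minimal writer sketch (the 2x2 flat-screen grid is purely illustrative):

```python
import struct

def write_pfm(path, positions, little_endian=True):
    """Write a color PFM file where each pixel stores an (x, y, z)
    position, as used for warp meshes.

    positions: list of rows (top row first), each row a list of
    (x, y, z) float tuples.
    """
    height = len(positions)
    width = len(positions[0])
    # A negative scale in the header signals little-endian float data.
    scale = -1.0 if little_endian else 1.0
    fmt = "<f" if little_endian else ">f"
    with open(path, "wb") as f:
        f.write(b"PF\n")                              # color PFM magic
        f.write(f"{width} {height}\n".encode("ascii"))
        f.write(f"{scale}\n".encode("ascii"))
        # PFM stores scanlines bottom-to-top.
        for row in reversed(positions):
            for (x, y, z) in row:
                f.write(struct.pack(fmt, x))
                f.write(struct.pack(fmt, y))
                f.write(struct.pack(fmt, z))

# Example: a flat 2x2 "screen" one unit in front of the origin.
grid = [
    [(-1.0, 1.0, 1.0), (1.0, 1.0, 1.0)],
    [(-1.0, -1.0, 1.0), (1.0, -1.0, 1.0)],
]
write_pfm("flat_screen.pfm", grid)
```

The hard part, of course, is not writing the file but producing the right per-pixel positions from the scanned model, which is what the tooling is supposed to do for us.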

Are you saying that we should instead use the same 3D model in UE to display the image (using a standard texture), then output this to the projector, either via LightAct or nDisplay?

I would certainly appreciate any help you can offer.

Ah yes. It seems I’ve misdirected you a bit when I suggested the UnrealLink approach. Sorry about that.

In this case, I would suggest you use the 2D approach. For the moment (until 3.7.0) this would require:

  • streaming the texture from UE to LightAct using Spout
  • manually warping the content so that it fits your projection object
  • if you have several projectors, you would also need to manually blend them.

I hope this helps,

Okay, thanks Mitja. This is disappointing as there don’t seem to be any viable solutions out there, which is surprising. However, I have been very impressed with LightAct so far, and I am really hopeful that you may be able to offer this functionality in a future release.

Are you able to confirm if this is the case, and if so when version 3.7.0 will be released?

Hi Adam,

This feature is planned for 3.7.0 which should be released in a few weeks.


Awesome, thanks Mitja!

Please keep me posted. I’m happy to help test a beta release.