This is the discussion forum for Helix Toolkit.
For bugs and new features, use the issue tracker located at GitHub.
Also try the chat room!

Loaded objects flip upside down when zooming

Anonymous 10 years ago 0
This discussion was imported from CodePlex

Elementu wrote at 2012-02-22 15:53:



I have several Helix controls in my application. Some of them are frozen to give the feeling of a 2D display that cannot be rotated.

The one that is not frozen uses a PerspectiveCamera and the 2D ones use an OrthographicCamera.

The problem I'm facing is that when I zoom into the 2D controls my objects flip upside down. This can also be seen on the view cube: the view changes, and even if I bring the view back to where it was before (guiding myself by the view cube), the objects are still upside down.

However, if I reset the camera the objects return to the correct position.

Is there a way to prevent this behavior and stop it from happening?


Thank you in advance.

elementu wrote at 2012-02-23 11:07:


What I discovered in my investigation is that in CameraController.cs (HelixToolkit.Wpf.CameraController), in the OnTimeStep method at the point where this.ZoomHandler.Zoom is called, this bug occurs when the "this.zoomSpeed * time" parameter has a large value. In fact, the bug occurred especially when I ran my application on a slower machine. On my main machine the bug only occurred if I added a Thread.Sleep of 500 or 1000 ms before calling the Zoom method.

From the Zoom method the delta parameter is passed on to the ChangeCameraWidth method, because my camera is an OrthographicCamera.

Finally, in the ChangeCameraWidth method, to prevent the delta parameter from having a value that is too large, if the absolute value of the parameter is greater than or equal to 1 (Math.Abs(delta) >= 1), I set delta = delta % 1.

Having done this, even if I add the delay or run the application on a slower machine, the bug is no longer reproduced.
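The workaround described above can be sketched like this (a minimal sketch; the helper name is an assumption, not the actual Helix Toolkit source):

```csharp
// Keep |delta| below 1 so a single zoom step cannot invert the
// orthographic camera width. Values with |delta| >= 1 are reduced
// modulo 1, as described above.
static double LimitDelta(double delta)
{
    return Math.Abs(delta) >= 1 ? delta % 1 : delta;
}
```

For example, a delta of 2.3 produced by a long time step becomes roughly 0.3, which is small enough for a single zoom step.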

I noticed that in the ChangeCameraPosition method, which is called from the Zoom method when the camera is a PerspectiveCamera rather than an OrthographicCamera, there is a section of code with similar functionality to limit the value of the delta parameter:

if (delta < -0.5)
      delta = -0.5;

At first look the behavior seems the same as before and everything looks all right with the change, except that the bug no longer occurs.

What I want to ask is: what could be the implications of the change I made (the limitation of the delta value)? In what case, or where, could it have a negative effect on the behavior or performance of the Helix Toolkit?


Thank you.

objo wrote at 2012-03-10 15:41:

Thanks for debugging this! I added the clamping of the delta value as you suggested (changeset c213bd42d8f7), and this seems to correct the problem. It limits the maximum zoom-out speed, but 50% zoom-out per zoom event should be sufficient, I think.

stfx wrote at 2012-05-21 00:05:

Nice fix - btw this also occurred with a perspective camera, but yes, it is fixed after that commit.

Another problem related to zooming is that if you zoom in too far (as far as you can), the camera will start to wobble at the end.

I fixed it by modifying the CameraHelper.LookAt function like this:


        public static void LookAt(
            PerspectiveCamera camera, Point3D target, Vector3D newLookDirection, double animationTime)
        {
            Point3D newPosition = target - newLookDirection;

            // prevent zooming in until camera gets wobbly due to precision loss
            if (Math.Abs(newPosition.Z) > 0.01)
            {
                AnimateTo(camera, newPosition, newLookDirection, camera.UpDirection, animationTime);
            }
        }
EDIT: This fix is definitely not correct.

How to retrieve info colors / brush from material class

Miller 8 years ago 0


I'm having some trouble getting color information from a ModelUIElement3D.

The question is how to determine the proper color when the user selects something in the drawing.

From a ModelUIElement3D you can retrieve the GeometryModel3D and the Material.

But I'm stuck because Visual Studio doesn't allow me to compare:

" If model.Material = Materials.Blue Then

End If"

The error is: the "=" operator isn't allowed for the Material class.

I've also tried to set a name on the material: "model.Material.SetName("Blue")"

It gives me an error saying that the material's Name property is read-only.

Any other advice?
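Material has no value equality, so comparing with "=" will not compile. One workaround (a sketch in C#; it assumes the material is a DiffuseMaterial over a SolidColorBrush, as the Helix Materials.Blue helper produces) is to compare the underlying brush colour:

```csharp
using System.Windows.Media;
using System.Windows.Media.Media3D;

static class MaterialColorHelper
{
    // True if the model's material is a DiffuseMaterial whose brush
    // is a SolidColorBrush of the given colour.
    public static bool HasDiffuseColor(GeometryModel3D model, Color color)
    {
        var diffuse = model.Material as DiffuseMaterial;
        var brush = diffuse?.Brush as SolidColorBrush;
        return brush != null && brush.Color == color;
    }
}
```

Alternatively, store your own tag when you create the models (for example in a dictionary keyed by the GeometryModel3D) instead of trying to name the frozen material afterwards.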




Problem with material when adding new child to the group

Anonymous 10 years ago 0
This discussion was imported from CodePlex

Tapir wrote at 2012-11-08 08:13:

I have a problem with adding a new child to a group. Regardless of which material I use, all figures are filled with the color of the last added one. Any idea how to solve this problem?

        static private Model3DGroup DrawLayers(int noOfLayers)
        {
            Material materialFiber = new DiffuseMaterial(new SolidColorBrush(Colors.Lime));
            Material materialCopper = new DiffuseMaterial(new SolidColorBrush(Colors.Orange));
            HelixToolkit.Wpf.MeshBuilder builder = new HelixToolkit.Wpf.MeshBuilder();
            Model3DGroup group = new Model3DGroup();

            for (int i = 0; i < noOfLayers; i++)
            {
                if (i == 0)
                    builder.AddBox(new Point3D(0, 0, 0), 15, 15, 1);
                else
                    builder.AddBox(new Point3D(0, 0, 2 * i), 15, 15, 1);

                MeshGeometry3D layer = builder.ToMesh();

                GeometryModel3D model;
                if (i % 2 == 0)
                    model = new GeometryModel3D() { Geometry = layer, Material = materialFiber, BackMaterial = materialFiber };
                else
                    model = new GeometryModel3D() { Geometry = layer, Material = materialCopper, BackMaterial = materialCopper };
                group.Children.Add(model);
            }
            return group;
        }

objo wrote at 2012-11-08 08:25:


Did you try moving

var builder = new HelixToolkit.Wpf.MeshBuilder();

inside your loop?

Otherwise you are appending more and more boxes to your geometry...
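Applied to the code above, the corrected loop might look like this (a sketch; it assumes the rest of DrawLayers stays unchanged):

```csharp
for (int i = 0; i < noOfLayers; i++)
{
    // A fresh builder per iteration, so each mesh contains only its own box.
    var builder = new HelixToolkit.Wpf.MeshBuilder();
    builder.AddBox(new Point3D(0, 0, 2 * i), 15, 15, 1);  // 2 * 0 == 0, so no special case for i == 0
    MeshGeometry3D layer = builder.ToMesh();

    Material material = (i % 2 == 0) ? materialFiber : materialCopper;
    group.Children.Add(new GeometryModel3D { Geometry = layer, Material = material, BackMaterial = material });
}
```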

Tapir wrote at 2012-11-08 08:31:

Works perfectly. Thanks a lot!


SharpDX - Transparency from image

Anonymous 10 years ago updated by Nathan Cornelius 2 months ago 337
This discussion was imported from CodePlex

geoarsal wrote at 2014-03-07 10:05:


How can I enable transparency from an image for textures?

In WPF 3D, all I need to do is create the diffuse material with an image brush and use it. In the SharpDX fork, I created a PhongMaterial and set its DiffuseMap to the same image I used in WPF 3D, but there the transparent regions are rendered as black.


Rogad wrote at 2014-03-08 16:05:

I used PNG textures. Also read this about how you need to order transparent parts:

I still have problems with layered transparency, though; when one part of the model overlaps the other it does not look right.

geoarsal wrote at 2014-03-08 17:48:

Thank you Rogad for your reply.

I would like to know: have you used PNG textures to attain transparency in the SharpDX fork of Helix Toolkit? The link you have provided is about transparency in WPF 3D, but I need transparency in SharpDX.

P.S. I already have transparency working with an image texture in WPF 3D, but I require the same functionality in the SharpDX fork using that same image.

Rogad wrote at 2014-03-08 17:56:

Sorry, I did not realise you were asking about SharpDX. I have not tried it; I couldn't see how to get support for it, so I gave up on it.

geoarsal wrote at 2014-03-12 19:04:

After debugging the code, I resolved this issue. The following are the changes I needed to make in Helix SharpDX.

In the pixel shader "PShaderPhong" method, the alpha value is taken solely from the diffuse material.

Current code in "Default.fx":

/// set diffuse alpha
I.a = vMaterialDiffuse.a;

Fix in "Default.fx":

I.a = vMaterialDiffuse.a * vMaterialTexture[3];

Also, the conversion of the texture map is done through a BMP encoder; however, apparently the BMP encoder doesn't support alpha in the texture map. Changing the BMP encoder to a PNG encoder resolves this issue.

Current code in "RenderUtil.cs":

public static byte[] ToByteArray(this System.Windows.Media.Imaging.BitmapSource bitmapSource)
{
    using (MemoryStream ms = new MemoryStream())
    {
        var encoder = new System.Windows.Media.Imaging.BmpBitmapEncoder();
        encoder.Frames.Add(System.Windows.Media.Imaging.BitmapFrame.Create(bitmapSource));
        encoder.Save(ms);
        return ms.ToArray();
    }
}

Fix in "RenderUtil.cs":

public static byte[] ToByteArray(this System.Windows.Media.Imaging.BitmapSource bitmapSource)
{
    using (MemoryStream ms = new MemoryStream())
    {
        var encoder = new System.Windows.Media.Imaging.PngBitmapEncoder();
        encoder.Frames.Add(System.Windows.Media.Imaging.BitmapFrame.Create(bitmapSource));
        encoder.Save(ms);
        return ms.ToArray();
    }
}

objo wrote at 2014-04-29 10:57:

I have forwarded this to the guys working on the SharpDX fork.

Rogad wrote at 2014-05-24 18:16:

Is there a similar fix for Helix ?

For my project this is the main area where Helix does not seem to work right.

Despite following various tutorials/guides on transparency, such as putting the transparent textures last, I still end up with this:

Some of the transparency is working, but as you can see not all.

Would really like to fix this :)
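One thing that may help in plain WPF Helix (a sketch; SortingVisual3D re-sorts its children by distance to the camera, which is the usual workaround for overlapping transparent parts):

```csharp
// Put transparent visuals inside a SortingVisual3D so they are
// periodically re-sorted back-to-front relative to the camera.
var sorter = new HelixToolkit.Wpf.SortingVisual3D
{
    Method = HelixToolkit.Wpf.SortingMethod.BoundingBoxCorners,
    SortingFrequency = 2 // re-sorts per second
};
sorter.Children.Add(transparentHairVisual); // your transparent visuals here
viewport.Children.Add(sorter);
```

This only sorts whole visuals, so self-overlapping transparent geometry (such as layered hair in a single mesh) can still render incorrectly.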

Rogad wrote at 2014-05-30 16:46:

Anyone? Objo? Are you still with us? :)

objo wrote at 2014-05-30 22:56:

Hi Rogad, yes I am following the discussion, but I don't know how to solve this. How are you rendering the hair? Are you using SharpDX or WPF? Can you create a demo application we can include with the library?

Rogad wrote at 2014-05-30 23:08:

Hi :)

I'm using WPF and Helix. I have not used SharpDX for anything yet; I was trying to find documentation or support for it, but haven't found much.

Unfortunately I cannot turn this into something distributable. I had to buy the models from Daz3D and I cannot redistribute them like that.
Under review

Need Advice On How To Sync Coordinate Systems

Anonymous 10 years ago updated by ไอยดา สุรีวงค์ 3 years ago 2
This discussion was imported from CodePlex

BogusException wrote at 2014-07-06 20:41:


I'm still working with the FlightsDemo to create a proof of concept for a project I want to do. The objective is a rotating globe with the flights on it, just like in the demo files included with the product today.

I am running into an issue where I can't figure out how to translate a mouse click to the proper coordinates on the sphere. The clearest demonstration of this issue is the following:

-As the Sphere is rotating:
            <EventTrigger RoutedEvent="Canvas.Loaded">
                <BeginStoryboard>
                    <Storyboard>
                        <DoubleAnimation x:Name="Rotate360" From="0" To="360" Duration="0:2:00" AutoReverse="False"
                                         RepeatBehavior="Forever" Storyboard.TargetName="rotation"
                                         Storyboard.TargetProperty="Angle" />
                    </Storyboard>
                </BeginStoryboard>
            </EventTrigger>

            <t:SphereVisual3D x:Name="TheEarthSphere"
                              Material="{StaticResource EarthJPG}"
                              PhiDiv="25">
                <t:SphereVisual3D.Transform>
                    <RotateTransform3D>
                        <RotateTransform3D.Rotation>
                            <AxisAngleRotation3D Axis="0,0,1" Angle="0" x:Name="rotation" />
                        </RotateTransform3D.Rotation>
                    </RotateTransform3D>
                </t:SphereVisual3D.Transform>
            </t:SphereVisual3D>
...the lat & lon at the top left of the WPF window do NOT update. They are tied to the non-animating coordinate system of the viewport, which is illustrated by the little cube at the bottom right:

-As the cube is rotating, when I click on it, the tube will be placed based on the current coordinates of the viewport (little cube, lat & lon at top left of display). This means:

-No matter what the rotating sphere is doing, the lines will be placed on the sphere at THOSE coordinates, NOT where the mouse was when clicked.
-Imagine you are clicking on a rotating sphere in the same (relative) position, say, dead center. The tube (flight) is placed on the rotating sphere, but always in the same place.

I know this is just a matter of finding a way in Helix 3D to rotate the little cube at the bottom right as the sphere rotates, but there are virtually no docs, and I can't figure out how to tie them together.

Q: In other words, how to make the little navigation cube at the bottom right reflect the position of the sphere as it rotates? Or can I just get the Sphere coordinates from a click somehow?

An example might help.

Here is the window as it starts. Note the nav cube (as we'll call it) position at the bottom right, where the sphere (earth) is in its rotation, and the lat & lon at the top left:

You should notice a few things from the above 2 screenshots:
  1. The position of the mouse is (almost) the same, and so is the lat & lon at top left.
  2. The sphere, however, has rotated between these 2 images, and the mouse is obviously over a different lat & lon in the 2nd shot than it was in the 1st.
  3. The nav cube at bottom right stays still.

It appears that the nav cube reflects the viewport coordinate system by default (I would love to change that to the sphere).
The sphere (and its child material) are rotating fine.
The click/hit point(s), though, do NOT reflect the 3D point clicked on the sphere, but rather the 3D point in the viewport coordinate system.
I have even tried to rotate the camera around the sphere, but I can't find documentation on how to accomplish this (leave the sphere static, rotate the camera in an orbit around it, looking at 0,0,0)...

My Request:

Please tell me how to tie the place where the mouse is clicked on the sphere to the sphere coordinates of that click/hit.

Any hints, guidance, example greatly appreciated!

P.S. Is there an example of AnimateOpacity?

objo wrote at 2014-07-11 21:55:

Can anyone help on this? Is there a bug? Transform of the model/visual not being included?
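One possible approach (a sketch, not a confirmed fix; it assumes a unit sphere centred at the origin with poles on the Z axis, and that the animated rotation is the transform on the sphere visual) is to map the world-space hit point back through the inverse of the sphere's transform before converting to lat/lon:

```csharp
using System;
using System.Windows.Media.Media3D;

static class GlobeHitHelper
{
    // Convert a world-space hit point on the globe to latitude/longitude
    // by undoing the sphere's current (animated) transform first.
    public static void WorldHitToLatLon(
        Point3D worldHit, Transform3D sphereTransform,
        out double latDeg, out double lonDeg)
    {
        // Undo the rotation applied by the storyboard.
        Point3D local = sphereTransform.Inverse.Transform(worldHit);

        var v = new Vector3D(local.X, local.Y, local.Z);
        v.Normalize();

        latDeg = Math.Asin(v.Z) * 180.0 / Math.PI;        // -90 .. 90
        lonDeg = Math.Atan2(v.Y, v.X) * 180.0 / Math.PI;  // -180 .. 180
    }
}
```

The hit point itself can come from a 3D hit test on the mouse position (for example HelixToolkit.Wpf.Viewport3DHelper.FindHits or VisualTreeHelper.HitTest).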

BogusException wrote at 2014-09-18 19:00:

Still no thoughts on the question above:

P.S. Is there an example of AnimateOpacity?

SphereVisual3D+ImageBrush incompatible with cutting planes?

Anonymous 10 years ago updated by ไอยดา สุรีวงค์ 3 years ago 2
This discussion was imported from CodePlex

joaoportela wrote at 2012-03-13 20:55:

I was making half a sphere by cutting a SphereVisual3D with a cutting plane, but this made my (textured) sphere disappear. The sphere renders normally without cutting planes.

So that I am clear:

This works:

MyVisual = new SphereVisual3D() { ThetaDiv = 60, PhiDiv = 30 };
var textureBrush = new ImageBrush(new BitmapImage(TEXTURE_URI));
MyVisual.Fill = textureBrush;

This doesn't:


MyVisual = new SphereVisual3D() { ThetaDiv = 60, PhiDiv = 30 };
var textureBrush = new ImageBrush(new BitmapImage(TEXTURE_URI));
MyVisual.Fill = textureBrush;
HelixToolkit.Wpf.CuttingPlaneGroup cutPlane = new CuttingPlaneGroup();
cutPlane.CuttingPlanes.Add(new Plane3D(new Point3D(0, 0, 0), new Vector3D(0, 0, -1)));

If I use a SolidColorBrush it works fine with or without cutting planes.
Am I doing something wrong or is this a known issue?
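For reference, the wiring between the group and the sphere looks roughly like this (a sketch; it assumes the CuttingPlaneGroup cuts the visuals added as its children, as in the Helix samples):

```csharp
// The CuttingPlaneGroup applies its planes to its child visuals, so the
// sphere goes inside the group and the group into the viewport.
var cutPlane = new HelixToolkit.Wpf.CuttingPlaneGroup();
cutPlane.CuttingPlanes.Add(
    new HelixToolkit.Wpf.Plane3D(new Point3D(0, 0, 0), new Vector3D(0, 0, -1)));
cutPlane.Children.Add(MyVisual);
viewport.Children.Add(cutPlane);
```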



objo wrote at 2012-03-14 00:33:

This is probably a limitation of the current implementation. I have not tested it with texture coordinates and ImageBrush materials.

joaoportela wrote at 2012-03-21 12:06:

Thanks for the information. I ended up manually building a half sphere for this.


WPF: After deploy strange camera position and Zoom don't work

Anonymous 10 years ago 0
This discussion was imported from CodePlex

xpix wrote at 2014-03-11 16:08:


i have some problem with a deployed application. Here my XAML for helix3d toolkit:
            <helix:HelixViewport3D x:Name="viewport" Grid.Column="0" Grid.ColumnSpan="3" Margin="5,0,0,0">
                <helix:HelixViewport3D.Camera>
                    <PerspectiveCamera x:Name="camera" />
                </helix:HelixViewport3D.Camera>
                <helix:ArrowVisual3D Point1="0,0,0" Point2="10 0 0" Fill="Red" Diameter="0.5" />
                <helix:ArrowVisual3D Point1="0,0,0" Point2="0 10 0" Fill="Green" Diameter="0.5" />
                <helix:ArrowVisual3D Point1="0,0,0" Point2="0 0 10" Fill="Blue" Diameter="0.5" />
                <helix:PipeVisual3D x:Name="Tool3D" Diameter="1" Point1="0,0,0" Point2="0,0,5" Visible="False">
                    <helix:PipeVisual3D.Fill>
                        <SolidColorBrush Color="#FFFF8B00" Opacity="0.5" />
                    </helix:PipeVisual3D.Fill>
                </helix:PipeVisual3D>
                <helix:GridLinesVisual3D Thickness="0.1" Center="0,0,0" Fill="#FFDADADA" Width="1000" Length="1000" />
            </helix:HelixViewport3D>
The application works perfectly in VS 2013: the camera position is on top and the scroll wheel works. But if I deploy/publish this application and open it from the published directory, the camera position is strange (close to null, with a perspective view) and the zoom function with the mouse scroll wheel doesn't work.

Is this a bug or a feature? Should I make a video to demonstrate the effect?
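A workaround sketch (an assumption, not a confirmed fix): set the camera explicitly when the window loads instead of relying on the default camera state, then fit the view:

```csharp
// Give the PerspectiveCamera an explicit top-down position on load,
// then zoom to fit the model; this avoids depending on default camera
// values that may differ in the deployed build.
private void Window_Loaded(object sender, RoutedEventArgs e)
{
    camera.Position = new Point3D(0, 0, 50);
    camera.LookDirection = new Vector3D(0, 0, -50);
    camera.UpDirection = new Vector3D(0, 1, 0);
    viewport.ZoomExtents(); // HelixViewport3D helper
}
```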

What are the Best Types of Charging Electronics?

Ryan Oslov 7 years ago updated by yanre strength 3 years ago 2

Why are Charging Electronics so Important?

Charging has become something that almost all of us do, because we have lots of devices that require it. Many of us use smartphones, tablets, and laptops every day, and that means we have to make sure we're using the correct charging electronics for them. With that said, as important as charging has become, the main problem right now is that most of us don't really know which charging electronics on the market are the best to use. Since charging has become so routine, we don't look past what we already use.

That is why it's important to know what other charging options are available: there are many more innovations than just the traditionally used USB wall chargers.

What are the Types of Charging Electronics and their Uses?

  1. Power banks are one of the fastest-growing charging electronics on the market, mainly because they are portable chargers. Most of us use USB wall chargers, and as powerful as those are, in the end we have to stay stationary while our devices charge. This can be a problem if we have places to be and things to do. As their popularity grows, there are now quite a lot of different types of power banks on the market. The two main types are power banks with low power capacities and ones with high power capacities. The most used ones have low power capacities, because they are smaller and lighter, so they can fit into your pocket. Power banks with higher power capacities are heavier; they hold lots of power and have more charging ports.
  2. The next most useful charging electronic is the Surge Protector with built-in USB charging ports. Most of us plug USB wall chargers into our Surge Protectors, and that can be a problem: some USB wall chargers cover the surrounding AC outlets, and each one occupies an AC outlet dedicated to just a USB charger. Instead, there are now Surge Protectors with USB ports directly on them, so you no longer have to use a USB wall charger with the Surge Protector since charging is already built in. These types of Surge Protectors have multiple charging ports, and they also make use of special charging tech such as Quick Charge to charge compatible smartphones at their maximum speeds.
  3. One of the most reliable sources of power is the sun, and Solar chargers are able to make good use of it. Solar chargers have multiple large solar panels that can capture lots of solar rays at once, so you can make the most of the charging experience. Solar chargers are also quite portable: they have lanyard holes that let you attach them to a backpack with the solar panels facing out, capturing as much of the sun as possible while you're on the move.



Router Problems and How To Solve It

Robert 7 years ago updated by ไอยดา สุรีวงค์ 3 years ago 1

Internet access is important for using various applications and services. The computer is a common device for accessing the internet at an effective speed. The router is another fundamental home network device, useful for sharing an internet connection among multiple devices such as computers and mobile phones. An IP address is a common identity factor, and every system and router has one. Internet security should be protected; otherwise hackers can easily steal important and secret information. The router needs to be password protected, or unauthorized people can access the internet connection. Router users have to open the router interface to set a strong password. The default gateway IP address of a router must be a private IP address, because router settings are only used for local network configuration. The IP address is one of the common private IP addresses, and many router manufacturers use this particular address. The router may develop problems during use, so users should know how to resolve them.


The Common Router Problems

Today most small-scale companies and individuals use routers to access decent-speed internet services. A router is helpful for accessing the internet within a particular range. Users should not forget the default IP address of the device, to avoid complications when accessing its settings. Users should connect a modem to the router to establish an effective connection. Many router problems can affect the device and the internet connection, so users have to know how to fix these issues.

These are common router issues. The router-interface issue is the most common one, and it can be solved by various techniques. Users may fail to reach the router's user interface for several reasons: they have to enter the IP address, and any spelling mistake means the interface won't load. Users should also use a fast web browser to access the router interface; a bad or slow browser may be the reason it can't be reached. Users should check the connection between the router and the computer to avoid access problems, and unplug the modem while accessing the router interface as a safety precaution. Problems accessing router settings can also be caused by a missing internet connection, the computer's firewall blocking access, or a lost or changed password.