This is the discussion forum for Helix Toolkit.
For bugs and new features, use the issue tracker on GitHub.
Also try the chat room!

using keys to move camera while in drag mode

Anonymous 11 years ago 0
This discussion was imported from CodePlex

mihaipruna wrote at 2014-08-11 20:30:

I should be able to move and rotate the camera with the keyboard, at least, even while doing drag-and-drop operations. Is there a way to retain that functionality?

Order Independent Transparency

Terry Price 10 years ago updated by ไอยดา สุรีวงค์ 4 years ago 1
Hello,

Does Helix 3D support any form of OIT, such as depth peeling?

If not, are there any plans to implement it in the future? It would be hugely helpful for my application.

If not, is there any way that I can 'tack it on' to the existing framework?

Best Wishes,

-TJP

Point Cloud using Helix Toolkit

Anonymous 11 years ago 0
This discussion was imported from CodePlex

Luka1211 wrote at 2013-10-02 20:50:

Hello,

I am new to C# and WPF, and I am trying to use Helix Toolkit to represent a Kinect depth image as a point cloud. I am trying to load a Point3DCollection into a PointsVisual3D as explained in the source for the PointsAndLines example. It keeps returning the message that it cannot implicitly convert Point3DCollection to PointsVisual3D. Is there an example just for creating point clouds, or just points, using the mentioned methods?

Thx

abuseukk wrote at 2013-12-25 18:54:

Hi,

I am also trying to create a point cloud of the human body using Kinect and Helix 3D. For now there is a sample in the examples directory which I think has some mistakes in converting depth to 3D mesh coordinates. I found this link useful:

Kinect Stackoverflow
using (DepthImageFrame depthFrame = e.OpenDepthImageFrame())
{
    if (depthFrame == null)
    {
        return; // the frame can be null if it was skipped or arrived late
    }

    DepthImagePixel[] depth = new DepthImagePixel[depthFrame.PixelDataLength];
    SkeletonPoint[] realPoints = new SkeletonPoint[depth.Length];

    // Copy the raw depth pixels out of the frame.
    depthFrame.CopyDepthImagePixelDataTo(depth);

    // Map every depth pixel to a skeleton-space (real-world) point.
    CoordinateMapper mapper = new CoordinateMapper(sensor);
    mapper.MapDepthFrameToSkeletonFrame(DEPTH_FORMAT, depth, realPoints);
}
It would be great if someone could help me convert these coordinates into correct Point3D coordinates :)

AnotherBor wrote at 2014-01-07 05:33:

Luka1211, PointsVisual3D has a Points property to which you should assign your collection of points.

Abuseukk, you should probably use the depth data as one of the coordinates of the points. new Point3D(xcoord, ycoord, depth[i]) would spread the points along the Z axis according to their depth.
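
A minimal sketch of that suggestion, assuming HelixToolkit.Wpf; depths, width, height, and viewport (an existing HelixViewport3D) are placeholders for your own depth data and view:

var points = new Point3DCollection();
for (int y = 0; y < height; y++)
{
    for (int x = 0; x < width; x++)
    {
        // Use the depth value as the Z coordinate so the points spread out in depth.
        points.Add(new Point3D(x, y, depths[y * width + x]));
    }
}

// Assign the collection to the Points property, not to the visual itself.
var cloud = new PointsVisual3D { Color = Colors.White, Size = 2 };
cloud.Points = points;
viewport.Children.Add(cloud);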

CameraController not defined

Anonymous 11 years ago 0
This discussion was imported from CodePlex

ppcomma wrote at 2011-12-21 21:07:

Hi!

I need to add a HelixView3D to my form in code, but I get a run-time error: "cameracontroller not defined". What should I do?


objo wrote at 2011-12-21 21:44:

Are you using an old version? The control has been renamed to HelixViewport3D.

Are you calling any methods on the control before the control template has been applied? (the camera controller is undefined until the template is applied).

In what part of the code are you adding the control?

Is it possible to create the control in the view XAML? See the examples.


ppcomma wrote at 2011-12-21 22:13:

This is my code; I have to add the HelixViewport3D controls dynamically:

string[] AllFiles = Directory.GetFiles(@"C:\Users\..", "*.3ds");
HelixViewport3D[] hel_ar = new HelixViewport3D[AllFiles.Count()];
for (int i = 0; i < AllFiles.Count(); i++)
{
    hel_ar[i] = new HelixViewport3D();

    PerspectiveCamera myPCamera = new PerspectiveCamera();
    myPCamera.Position = new Point3D(10000, 0, 0);
    myPCamera.LookDirection = new Vector3D(-1, 0, 0);
    myPCamera.FieldOfView = 60;
    hel_ar[i].Viewport.Camera = myPCamera;

    DirectionalLight myDirectionalLight = new DirectionalLight();
    myDirectionalLight.Color = Colors.White;
    myDirectionalLight.Direction = new Vector3D(-0.61, -0.5, -0.61);

    hel_ar[i].Viewport.Height = 115;
    hel_ar[i].Viewport.Width = 115;

    Model3DGroup m3d = new Model3DGroup();
    m3d = ModelImporter.Load(AllFiles[i]);
    m3d.Children.Add(myDirectionalLight);

    ModelVisual3D m = new ModelVisual3D();
    m.Content = m3d;
    hel_ar[i].Viewport.Children.Add(m);

    stackPanel1.Children.Add(hel_ar[i]);
}


ppcomma wrote at 2011-12-21 22:22:

Thanks, I did it.


ppcomma wrote at 2012-01-28 15:52:

One more question..

I create a HelixViewport3D in C# code and import a 3ds model. When I run the project, the HelixViewport does not show anything until I double-click the middle mouse button.

What should I do to fix it? I want the HelixViewport to show my model at once.


objo wrote at 2012-01-29 09:20:

It seems like your camera position/look direction is wrong. Try to use the ZoomExtents or ResetCamera method in HelixViewport3D after importing the 3ds model.

There is currently no "ModelLoaded" event on the FileModelVisual3D; I think that is something that should be added.
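
A minimal sketch of that suggestion, assuming HelixToolkit.Wpf; viewport and the model path are placeholders:

var importer = new ModelImporter();
Model3DGroup model = importer.Load("mymodel.3ds"); // placeholder path

var visual = new ModelVisual3D { Content = model };
viewport.Children.Add(visual);

// Fit the camera to the newly added content (or call viewport.ResetCamera()).
viewport.ZoomExtents();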


Multiline Closed Path Polygons

Michael Powell 10 years ago updated by ไอยดา สุรีวงค์ 4 years ago 1
Hello,

Please pardon my clumsy use of verbiage here...

I will probably default to the RectangleVisual3D initially, but I'd like to know whether there is something like a closed-path LinesVisual3D. It could work its way out in 3D; in my case it would be along a 2D plane.

Basically I'd like to render things like triangles, trapezoids, etc., and then be able to specify things like the line style (i.e. brush) and the fill.

Thank you...
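
For what it's worth, here is a rough sketch of one way to get a filled, outlined polygon with the existing pieces. It assumes HelixToolkit.Wpf, that MeshBuilder.AddPolygon accepts the corner points (otherwise a manual triangulation does the same job), and that viewport is a placeholder HelixViewport3D; the example is a planar trapezoid in the Z = 0 plane.

var corners = new[]
{
    new Point3D(0, 0, 0),
    new Point3D(4, 0, 0),
    new Point3D(3, 2, 0),
    new Point3D(1, 2, 0),
};

// Outline: LinesVisual3D draws independent segments, so add the vertices in pairs
// to close the path.
var outline = new LinesVisual3D { Color = Colors.Black, Thickness = 2 };
for (int i = 0; i < corners.Length; i++)
{
    outline.Points.Add(corners[i]);
    outline.Points.Add(corners[(i + 1) % corners.Length]);
}

// Fill: triangulate the closed path into a mesh and apply a brush.
var builder = new MeshBuilder(false, false); // no normals/texture coordinates needed
builder.AddPolygon(corners);
var fill = new ModelVisual3D
{
    Content = new GeometryModel3D(builder.ToMesh(), MaterialHelper.CreateMaterial(Brushes.LightBlue))
};

viewport.Children.Add(fill);
viewport.Children.Add(outline);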

Help needed rendering Kinect data with HelixToolkit.Wpf.SharpDX

Anonymous 11 years ago 0
This discussion was imported from CodePlex

Ehee wrote at 2014-05-27 18:09:

I am trying to build a working example program to render Kinect data using the Helix SharpDX API, and I need the correct steps for continually rendering a changing model. I have the old Kinect example from the WPF API working, and I can generate a 3D model from the Kinect; however, all of the SharpDX examples use static models that are created in the MainViewModel constructors prior to rendering. Could someone provide the most efficient steps to continually render a changing model using the Helix SharpDX API?

When does Viewport3D.CameraController get initialized?

Anonymous 11 years ago 0
This discussion was imported from CodePlex

mplisov wrote at 2013-03-11 22:04:

Greetings,

Thanks for your great toolkit.

Please explain a little bit: when does CameraController get initialized? It seems to be null on Viewport3D creation, and I need to set some properties on it when programmatically creating the Viewport3D.

Best regards,
Mikhail

objo wrote at 2013-04-15 13:01:

For HelixViewport3D the CameraController is defined in the control template. You can get the reference after the OnApplyTemplate method has been called.
Can the DefaultCamera property help in your case?
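
A minimal sketch of that timing, assuming HelixToolkit.Wpf; panel is a placeholder parent container, and the rotation-mode line is just an example of a property you might want to set:

var viewport = new HelixViewport3D();
panel.Children.Add(viewport);

// CameraController is null until the control template has been applied.
viewport.Loaded += (s, e) =>
{
    // OnApplyTemplate has run by now, so the controller is available.
    viewport.CameraController.CameraRotationMode = CameraRotationMode.Trackball;
};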

How to set the camera in HelixViewport3D

Anonymous 11 years ago 0
This discussion was imported from CodePlex

charismatubagus wrote at 2013-01-25 08:19:

I am a newbie with Helix Toolkit.

I want to know how I can control the camera to view only R/L/U/D at the beginning. How can I do it in XAML?


Thanks


objo wrote at 2013-02-07 21:08:

The HelixViewport3D control has "Camera" and "DefaultCamera" properties.

See examples in
Source\Examples\ExampleBrowser\Examples\SurfacePlot\MainWindow.xaml
and
Source\Examples\ExampleBrowser\Examples\CameraControl\MainWindow.xaml

The DefaultCamera property sets the camera properties used when resetting the view.
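
For reference, a minimal sketch of the same idea in code-behind (the corresponding XAML simply sets the Camera and DefaultCamera properties on the HelixViewport3D element); the camera values and viewport are placeholders:

// DefaultCamera holds the settings applied when the view is reset;
// Camera is the camera currently shown.
var home = new PerspectiveCamera
{
    Position = new Point3D(0, 0, 10),
    LookDirection = new Vector3D(0, 0, -10),
    UpDirection = new Vector3D(0, 1, 0),
    FieldOfView = 45
};

viewport.DefaultCamera = home;
viewport.Camera = new PerspectiveCamera
{
    Position = home.Position,
    LookDirection = home.LookDirection,
    UpDirection = home.UpDirection,
    FieldOfView = home.FieldOfView
};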

Strange behavior when I put two RectangleVisual3D close together

Anonymous 11 years ago 0
This discussion was imported from CodePlex

qjt wrote at 2014-03-05 03:14:

Hi, when I was using your toolkit a strange thing occurred. I put two RectangleVisual3D of different colors (one of them is RED, the other is BLUE) close together, and one RectangleVisual3D looked like it dissolved into the other, just like the picture below:

The code:

<t:RectangleVisual3D Length="100" Width="100" Origin="0,0,1" Fill="#FFF50404" />
<t:RectangleVisual3D Length="100" Width="100" Origin="0,0,1.0001" Fill="#FF0611FB" />

qjt wrote at 2014-03-05 03:17:

another picture:


Lolipedobear wrote at 2014-03-05 11:53:

Did you set NearPlaneDistance on the Camera? I had the same problem when NearPlaneDistance was 0.

qjt wrote at 2014-03-06 03:40:

Thank you for your help, but it doesn't solve the problem. The code:

this.Position = camera.Position;
this.LookDirection = camera.LookDirection;
this.UpDirection = camera.UpDirection;
//this.NearPlaneDistance = camera.NearPlaneDistance;
this.NearPlaneDistance = 0;
this.FarPlaneDistance = camera.FarPlaneDistance;

Lolipedobear wrote at 2014-03-06 06:05:

Set a value different from 0. The chosen value depends on the situation: if the scene is large the value can be 10 or more; if it is a small scene the value can be 0.1. I set NearPlaneDistance to 15 and it resolved the problem on a 1500x1500 units scene. Parts of the scene closer to the camera than the value of the NearPlaneDistance property won't be displayed. As far as I know this problem is connected with floating-point rounding when the renderer is trying to determine which object is closer to the camera.
<PerspectiveCamera Position="0, 600, 525" LookDirection="0, -600, -525" UpDirection="0,1,0" NearPlaneDistance="15"/>

Getting all vertices from MeshBuilder?

Ardahan 9 years ago updated by ไอยดา สุรีวงค์ 4 years ago 3

I want to get all the vertices from my MeshBuilder sample. In MeshBuilder there is a list of normals and a list of positions. Do these positions define the triangle vertices? I am confused because in my cube model there are 36 triangle indices, and the positions and normals each also number 36. So do these positions represent each vertex, or what? Should the normal count equal the triangle count, since each triangle has one normal and three vertices?
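
A minimal sketch of reading the data back out, assuming the HelixToolkit.Wpf MeshBuilder (builder is a placeholder for your existing instance). Positions are the vertices, Normals are per vertex (one entry per position, not per triangle), and TriangleIndices reference the positions in groups of three; 36 positions for a cube usually means the builder did not share any vertices between triangles.

var positions = builder.Positions;     // IList<Point3D>, one entry per vertex
var normals = builder.Normals;         // IList<Vector3D>, parallel to Positions
var indices = builder.TriangleIndices; // three entries per triangle

for (int t = 0; t < indices.Count; t += 3)
{
    Point3D a = positions[indices[t]];
    Point3D b = positions[indices[t + 1]];
    Point3D c = positions[indices[t + 2]];
    Console.WriteLine($"Triangle {t / 3}: {a} {b} {c}");
}

// The same data ends up on the finished mesh:
MeshGeometry3D mesh = builder.ToMesh();
// mesh.Positions, mesh.Normals, mesh.TriangleIndices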