Your comments
Hi Mark, there was an issue with the wrong `BrowsableAttribute` being used in the models. The PropertyGrid requires `[System.ComponentModel.Browsable]`. See the updated example source code on GitHub.
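For reference, a minimal sketch of what this kind of fix looks like (the class and property names below are hypothetical, not taken from the actual example source): the PropertyGrid checks for `System.ComponentModel.BrowsableAttribute`, so the models must use that attribute rather than a `Browsable` attribute from some other namespace.

```csharp
// Hypothetical model illustrating the fix: use the Browsable attribute
// from System.ComponentModel, not one from another namespace.
using System.ComponentModel;

public class MyModel
{
    [Browsable(true)]   // shown in the PropertyGrid
    public double Radius { get; set; }

    [Browsable(false)]  // hidden from the PropertyGrid
    public string InternalId { get; set; }
}
```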
I think this is a bug; I guess some of the bindings in the `CombinedManipulator` constructor are not reconstructed when rotation is re-enabled. Could you add an issue on GitHub?
Isn't Open Inventor based on OpenGL? WPF 3D is built on DirectX, so I guess there is not much interoperability between the two...
The `HelixToolkit.Wpf.Viewport3DHelper.FindHits` method is helpful when working with hit testing. It uses `VisualTreeHelper.HitTest` but returns an ordered list of results instead of relying on callbacks.
First you need to subscribe to mouse events, either on the viewport or on a `UIElement3D`. Here is an example:
```xml
<Window x:Class="HitTestDemo.Window1"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        xmlns:h="http://helix-toolkit.org/wpf"
        Title="Hit test demo" Height="480" Width="640">
    <!-- Subscribe to MouseDown events on the Viewport -->
    <h:HelixViewport3D x:Name="viewport1" MouseDown="OnViewportMouseDown">
        <h:DefaultLights/>
        <!-- Set the back material to null to exclude hits on the inside of the geometry -->
        <h:PipeVisual3D Point1="-10,0,2" Point2="10,0,2" BackMaterial="{x:Null}"/>
        <!-- The geometry is created in code-behind; subscribe to the MouseDown event
             (when these events are handled, the viewport will not receive the MouseDown event) -->
        <ModelUIElement3D x:Name="element1" MouseDown="OnVisualMouseDown"/>
    </h:HelixViewport3D>
</Window>
```

The code-behind (sorry, no MVVM in this example) creates the geometry for the second visual and uses the `FindHits` method to calculate the texture coordinates of the hit points:
```csharp
namespace HitTestDemo
{
    using System.Diagnostics;
    using System.Windows;
    using System.Windows.Input;
    using System.Windows.Media.Media3D;

    using HelixToolkit.Wpf;

    public partial class Window1
    {
        public Window1()
        {
            this.InitializeComponent();

            // Create a pipe geometry and model
            var mb = new MeshBuilder(false, true);
            mb.AddPipe(new Point3D(-10, 0, 0), new Point3D(10, 0, 0), 0, 1, 50);
            var model = new GeometryModel3D(mb.ToMesh(), Materials.Green);

            // Assign the model to the ModelUIElement3D
            this.element1.Model = model;
        }

        // Called when the user clicks on the ModelUIElement3D visual
        private void OnVisualMouseDown(object sender, MouseButtonEventArgs e)
        {
            var position = e.GetPosition(this.viewport1);
            this.CalculateHitTextureCoordinates(position);
            e.Handled = true;
        }

        // Called when the user clicks on the 3D viewport (also outside the visual model)
        private void OnViewportMouseDown(object sender, MouseButtonEventArgs e)
        {
            var position = e.GetPosition(this.viewport1);
            this.CalculateHitTextureCoordinates(position);
            e.Handled = true;
        }

        private void CalculateHitTextureCoordinates(Point position)
        {
            // Loop over all hits (sorted by distance from the camera;
            // hidden models are also included in this sequence)
            foreach (var hit in this.viewport1.Viewport.FindHits(position))
            {
                // Get the texture coordinates of the triangle that was hit
                var tc1 = hit.Mesh.TextureCoordinates[hit.RayHit.VertexIndex1];
                var tc2 = hit.Mesh.TextureCoordinates[hit.RayHit.VertexIndex2];
                var tc3 = hit.Mesh.TextureCoordinates[hit.RayHit.VertexIndex3];

                // Calculate the texture coordinate of the hit point by linear interpolation
                var tc = new Point(
                    (tc1.X * hit.RayHit.VertexWeight1) + (tc2.X * hit.RayHit.VertexWeight2) + (tc3.X * hit.RayHit.VertexWeight3),
                    (tc1.Y * hit.RayHit.VertexWeight1) + (tc2.Y * hit.RayHit.VertexWeight2) + (tc3.Y * hit.RayHit.VertexWeight3));

                Debug.WriteLine("Texture coordinate: " + tc);
            }
        }
    }
}
```

I am sure this example can be improved/simplified a lot, but it was the first I could think of :-)
See the SurfacePlot demo for an example of how to use a linear gradient brush and texture coordinates to create a color-coded mesh! https://github.com/helix-toolkit/helix-toolkit/tree/master/Source/Examples/WPF/ExampleBrowser/Examples/SurfacePlot
An alternative to wrapping a bitmap on the 3D model could be to use a linear gradient brush (defined by the same colors as the heat map rendered by OxyPlot) and set texture coordinates on the 3D geometry.
But your solution using a bitmap may be slightly easier when you want to transfer 3D hit results (given as vertex indices/weights): you can then calculate the 2D coordinate from the texture coordinates and the hit test vertex weights.
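For illustration, a small sketch of that mapping (plain C#, no WPF types; the class and method names here are made up for this example): interpolate the hit point's texture coordinate from the three vertex texture coordinates using the hit-test vertex weights, then scale to bitmap pixel coordinates.

```csharp
using System;

public static class HitMapping
{
    // Interpolate the texture coordinate of the hit point from the three
    // triangle vertices (u1..u3, v1..v3) using the barycentric vertex
    // weights reported by the hit test, then scale to bitmap pixels.
    public static (int X, int Y) ToPixel(
        double u1, double v1, double u2, double v2, double u3, double v3,
        double w1, double w2, double w3,
        int bitmapWidth, int bitmapHeight)
    {
        double u = (u1 * w1) + (u2 * w2) + (u3 * w3);
        double v = (v1 * w1) + (v2 * w2) + (v3 * w3);

        // Clamp so a hit exactly on the u = 1 or v = 1 edge stays in range
        int x = Math.Min((int)(u * bitmapWidth), bitmapWidth - 1);
        int y = Math.Min((int)(v * bitmapHeight), bitmapHeight - 1);
        return (x, y);
    }
}
```

For example, weights (0.5, 0.25, 0.25) on a triangle with texture coordinates (0,0), (1,0), (0,1) give u = v = 0.25, i.e. pixel (75, 75) on a 300×300 bitmap.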
ps. I like "AltMuligKnappen"!
Could you apply the inverse transform to the cutting plane? (A workaround, since the cut method does not support transforms.)
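A rough sketch of that workaround (untested; it assumes a cut helper that takes a plane position and normal, like `MeshGeometryHelper.Cut` in HelixToolkit.Wpf): invert the model transform via `Matrix3D`, transform the plane into the mesh's local coordinate system, and cut the untransformed mesh. Note that transforming the normal with the inverted matrix is only correct for rigid transforms (rotation/translation/uniform scale); a non-uniform scale would need the inverse-transpose.

```csharp
using System.Windows.Media.Media3D;
using HelixToolkit.Wpf;

public static class CutWorkaround
{
    // Cut a transformed model by transforming the cutting plane into the
    // mesh's own coordinate system instead of transforming the mesh.
    public static MeshGeometry3D CutTransformed(
        MeshGeometry3D mesh, Transform3D modelTransform,
        Point3D planePosition, Vector3D planeNormal)
    {
        var m = modelTransform.Value;
        m.Invert();

        // The plane in the local coordinate system of the mesh
        var localPosition = m.Transform(planePosition);
        var localNormal = m.Transform(planeNormal); // OK for rigid transforms
        localNormal.Normalize();

        return MeshGeometryHelper.Cut(mesh, localPosition, localNormal);
    }
}
```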
I guess the triangulator does not support texture coordinates. That could be added as a new feature, I think.
This is a general XAML/WPF question; it may be better to try other forums!
Good idea! Please create an issue (feature request) on GitHub.