
Clickability between oxyplot and helix

pettertho 10 years ago, updated by anonymous 8 years ago
Hi all,

A classmate and I are doing a school assignment where we're asked to develop a visualization program for tomography data. For parts of this project we are using Helix Toolkit and OxyPlot: Helix to model the pipes in question with Pipe3D, and OxyPlot to plot an intensity chart for the data set.
The data set consists of several 2D matrices, one for each measurement. This is processed into a bitmap which is displayed on the 3D model.

We are using MVVM as our design pattern.

We want to introduce clickability and interactivity between these two parts. When a point or an area is marked with the pointer in the OxyPlot chart, the same area should also be marked on the Helix model, and vice versa. The question is: what is the best way to do this, and how does one do it? (Bear in mind that we are not experienced programmers.)

Please feel free to ask any questions!
Congratulations; you have crossed into the realm of Real Computer Science, where Easy Answers Do Not Exist.

What it sounds like you want is called "Hit Testing". https://msdn.microsoft.com/en-us/library/ms752097.aspx

Fortunately, it is baked into WPF, including WPF 3D. This will mean that you have to use Helix/WPF rather than Helix/SharpDX. You will need to develop your own linear interpolations to interpret the results you get back from the HitTest() methods so that you can program the functionality you want against the click methods.
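To make that concrete, here is a minimal sketch of raw WPF 3D hit testing (the helper class and method names are invented for illustration; VisualTreeHelper.HitTest, PointHitTestParameters and RayMeshGeometry3DHitTestResult are the relevant WPF types):

using System.Diagnostics;
using System.Windows;
using System.Windows.Controls;
using System.Windows.Media;
using System.Windows.Media.Media3D;

public static class HitTestSketch
{
    // Call this from a MouseDown handler with the mouse position relative to the viewport.
    public static void ReportMeshHits(Viewport3D viewport, Point position)
    {
        VisualTreeHelper.HitTest(
            viewport,
            null, // no filter callback
            result =>
            {
                // 3D mesh hits come back as RayMeshGeometry3DHitTestResult, which carries
                // the hit triangle's vertex indices and barycentric weights.
                var meshHit = result as RayMeshGeometry3DHitTestResult;
                if (meshHit != null)
                {
                    // PointHit is the 3D hit point; VertexIndex1..3 and VertexWeight1..3 are
                    // what you later interpolate to map the hit back onto your 2D data.
                    Debug.WriteLine("Hit 3D point: " + meshHit.PointHit);
                }

                return HitTestResultBehavior.Continue; // keep reporting hits behind this one
            },
            new PointHitTestParameters(position));
    }
}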

As design patterns go, MVVM is relatively advanced. You will very steadily become experienced programmers this way.
Thank you for your reply.

I've now read the link you sent me. I recently purchased a book called WPF 4.5 Unleashed, which has been very helpful and also touches on this topic. This seems like what we need.

I understand that we will be using hit testing to recognize where the clicks are performed, and whether our program layout is of importance here. We are, by the way, strictly using Helix/WPF for 3D purposes, as well as OxyPlot for the charting part.

I do not understand what you mean by creating a linear interpolation to interpret the hit test results.
If I were to click the 2D bitmap, and wanted the same point to be marked on the 3D model corresponding with the .bmp, is this where the interpolation is necessary?

Beneath is an early version (before MVVM) where you can see the 2D and 3D representations. The 2D view is an Image object/BitmapSource, and the 3D model is a Pipe3D (Helix) with the bitmap wrapped around it, in a HelixViewport3D.

[screenshot omitted: the 2D bitmap view and the 3D pipe model]

"If I were to click the 2D bitmap, and wanted the same point to be marked on the 3D model corresponding with the .bmp, is this where the interpolation is necessary?"

That's right. But, you're the only one who knows what that correspondence looks like. MVVM architecture models will help you choose a place in your code for that kind of computation, but, again, the decision process is yours to make. Under MVVM, the interpolation probably goes best in the ViewModel.
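Purely as an illustration of where that computation could live (every name below is invented, and it assumes one bitmap column per unit of pipe length, with the rows wrapping around the circumference and the pipe axis along Y), a ViewModel fragment might look like this:

using System;
using System.Windows.Media.Media3D;

public class PipeViewModel
{
    public int BitmapWidth { get; set; }    // number of columns = pipe length
    public int BitmapHeight { get; set; }   // number of rows = pipe circumference

    // Maps a bitmap pixel to the corresponding 3D point on the pipe surface.
    public Point3D MapPixelToPipe(int column, int row)
    {
        double radius = this.BitmapHeight / (2 * Math.PI);
        double angle = row / radius;    // arc length divided by radius

        // Pipe axis along Y; X and Z give the position around the circumference.
        return new Point3D(radius * Math.Sin(angle), column, radius * Math.Cos(angle));
    }
}

The inverse mapping (3D hit point back to a bitmap pixel) could live next to it, so the View only raises click events and binds to the result.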
Alright, thank you.
An alternative to wrapping a bitmap on the 3D model could be to use a linear gradient brush (defined by the same colors as the heat map rendered by OxyPlot) and set texture coordinates on the 3D geometry.

But your solution using a bitmap may be slightly easier when you want to transfer 3D hit results (given as vertex index/weights) - then you can calculate the 2D coordinate from the texture coordinates and the hit test vertex weights.
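As a rough sketch of that gradient-brush idea (the helper names and the color stops below are placeholders, not the actual OxyPlot palette): give each vertex a texture coordinate whose X component is the normalized measurement value, and paint the mesh with a horizontal LinearGradientBrush so the value selects the color.

using System.Windows;
using System.Windows.Media;
using System.Windows.Media.Media3D;

public static class GradientMaterialSketch
{
    // A material based on a horizontal gradient: a vertex with texture coordinate (t, 0)
    // is shaded with the gradient color at position t in [0, 1].
    public static Material CreateHeatMapMaterial()
    {
        var brush = new LinearGradientBrush(
            new GradientStopCollection
            {
                new GradientStop(Colors.Blue, 0.0),
                new GradientStop(Colors.Green, 0.5),
                new GradientStop(Colors.Red, 1.0),
            },
            new Point(0, 0),
            new Point(1, 0)); // horizontal: only the X texture coordinate matters

        return new DiffuseMaterial(brush);
    }

    // Sets one texture coordinate per vertex from a normalized measurement value;
    // normalizedValues must contain one entry per point in mesh.Positions.
    public static void SetTextureCoordinates(MeshGeometry3D mesh, double[] normalizedValues)
    {
        var coordinates = new PointCollection();
        foreach (var value in normalizedValues)
        {
            coordinates.Add(new Point(value, 0)); // value in [0, 1] picks the gradient color
        }

        mesh.TextureCoordinates = coordinates;
    }
}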

ps. I like "AltMuligKnappen"!
I'm not sure if I've understood this correctly. We've tried drawing the pipe using polygons and a mesh from WPF 3D to define each "pixel's" color, but the performance was not great. Is the PipeVisual3D in fact a mesh builder that simplifies the drawing of pipes?

One thing I did not mention before was that the data set we receive is processed with bilinear interpolation, to give a nicer view.
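Just to make that term concrete, here is a minimal sketch of what bilinear interpolation of a matrix does (the method name is invented; this is not the actual preprocessing code):

using System;

public static class InterpolationSketch
{
    // Samples a matrix at a fractional (row, column) position by blending the four
    // surrounding values; upsampling a matrix this way gives the smoother view described above.
    public static double SampleBilinear(double[,] matrix, double row, double column)
    {
        int r0 = (int)Math.Floor(row);
        int c0 = (int)Math.Floor(column);
        int r1 = Math.Min(r0 + 1, matrix.GetLength(0) - 1);
        int c1 = Math.Min(c0 + 1, matrix.GetLength(1) - 1);

        double fr = row - r0;       // fractional part along the rows
        double fc = column - c0;    // fractional part along the columns

        // Interpolate along the columns first, then along the rows.
        double top = (matrix[r0, c0] * (1 - fc)) + (matrix[r0, c1] * fc);
        double bottom = (matrix[r1, c0] * (1 - fc)) + (matrix[r1, c1] * fc);
        return (top * (1 - fr)) + (bottom * fr);
    }
}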

  • When drawing the pipe with the Helix Toolkit, are we able to extract coordinates on the model? In other words, can we use the model itself as a coordinate system to plot the different color values directly onto the model with a brush? I am not sure if we need to use a linear gradient brush, as the resolution of the matrix is good enough after the interpolation I mentioned.

  • The pipe has the same dimensions as the bitmap: it is as long as the bitmap is wide, and its radius is the bitmap height divided by 2π. When translating a 3D coordinate from a MouseDown event in the viewport, the Y coordinate will be the same in both the viewport and the bitmap. To get the height coordinate in the bitmap, we used the following method:
public static double GetRow(double x, double z, Measurement m) {
    double noRows = m.WallLossMatrix.rows;
    double radius = noRows / (Math.PI * 2);

    if (z >= 0) {
        if (x >= 0) {   // First quadrant (x >= 0, z >= 0): angle in [0, pi/2]
            return Math.Asin(x / radius) * radius;
        }
        // Second quadrant (x < 0, z >= 0): angle in (3*pi/2, 2*pi)
        return (2 * Math.PI - Math.Asin(-x / radius)) * radius;
    }

    // Third and fourth quadrants (z < 0): Asin is negative for x < 0 and positive for x >= 0,
    // so this single expression covers both, giving an angle in (pi/2, 3*pi/2)
    return (Math.PI - Math.Asin(x / radius)) * radius;
}
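As an aside, the same mapping could probably be written with Math.Atan2, which handles all four quadrants in one expression. A sketch with the same conventions as GetRow above, taking the row count as a parameter instead of the Measurement object:

public static double GetRowAtan2(double x, double z, double noRows) {
    double radius = noRows / (Math.PI * 2);
    double angle = Math.Atan2(x, z);    // angle from the +Z axis toward +X, in (-pi, pi]
    if (angle < 0) {
        angle += 2 * Math.PI;           // normalize to [0, 2*pi)
    }
    return angle * radius;              // arc length = row coordinate
}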

We are not familiar with the term hit test. To get the 3D coordinate on a MouseDown we used this event handler:

private void hViewport_MouseDown(object sender, MouseButtonEventArgs e) {
    var pt = hViewport.FindNearestPoint(e.GetPosition(hViewport));
    if (pt.HasValue) {
        // (...)
    }
}
Is this hit testing, or is there a better way?


A lot of questions come to mind, some of which may not be relevant, but please know that all help is very much appreciated!

Below is an example picture from a bitmap representation of one measurement, after and before interpolation (the matrix is about 12 times the size of the original):

[image omitted: one measurement after and before interpolation]


ps, that button can do all kinds of things!
The `HelixToolkit.Wpf.Viewport3DHelper.FindHits` method is helpful when working with hit testing. It uses VisualTreeHelper.HitTest but creates an ordered list instead of depending on callbacks.

First you need to subscribe to mouse events, either on the viewport or on a UIElement3D. Here is an example:
<Window x:Class="HitTestDemo.Window1"
        xmlns="http://schemas.microsoft.com/winfx/2006/xaml/presentation"
        xmlns:x="http://schemas.microsoft.com/winfx/2006/xaml"
        xmlns:h="http://helix-toolkit.org/wpf"
        Title="Hit test demo" Height="480" Width="640">
    <!-- Subscribe to MouseDown events on the Viewport -->
    <h:HelixViewport3D x:Name="viewport1" MouseDown="OnViewportMouseDown">
        <h:DefaultLights/>
        <!-- Set the back material to null to exclude hits on the inside of the geometry -->
        <h:PipeVisual3D Point1="-10,0,2" Point2="10,0,2" BackMaterial="{x:Null}" ></h:PipeVisual3D>
        <!-- The geometry is created in code behind, subscribe to the MouseDown event 
        (when these events are handled, the viewport will not receive the MouseDown event) -->
        <ModelUIElement3D x:Name="element1" MouseDown="OnVisualMouseDown"></ModelUIElement3D>
    </h:HelixViewport3D>
</Window>
The code-behind (sorry no MVVM in this example) creates the geometry for the second visual, and uses the FindHits method to calculate the texture coordinates of the hit points:
namespace HitTestDemo
{
    using System.Diagnostics;
    using System.Windows;
    using System.Windows.Input;
    using System.Windows.Media.Media3D;
    using HelixToolkit.Wpf;
    public partial class Window1
    {
        public Window1()
        {
            this.InitializeComponent();
            // Create a pipe geometry and model
            var mb = new MeshBuilder(false, true);
            mb.AddPipe(new Point3D(-10, 0, 0), new Point3D(10, 0, 0), 0, 1, 50);
            var model = new GeometryModel3D(mb.ToMesh(), Materials.Green);
            // Assign the model to the ModelUIElement3D
            this.element1.Model = model;
        }
        // this is called when the user clicks on the ModelUIElement3D visual
        private void OnVisualMouseDown(object sender, MouseButtonEventArgs e)
        {
            var position = e.GetPosition(this.viewport1);
            this.CalculateHitTextureCoordinates(position);
            e.Handled = true;
        }
        // this is called when the user clicks on the 3D viewport (also outside the visual model)
        private void OnViewportMouseDown(object sender, MouseButtonEventArgs e)
        {
            var position = e.GetPosition(this.viewport1);
            this.CalculateHitTextureCoordinates(position);
            e.Handled = true;
        }
        private void CalculateHitTextureCoordinates(Point position)
        {
            // loop over all hits (sorted by distance from camera, hidden models will also be included in this sequence)
            foreach (var hit in this.viewport1.Viewport.FindHits(position))
            {
                // get the texture coordinates of the triangle that was hit
                var tc1 = hit.Mesh.TextureCoordinates[hit.RayHit.VertexIndex1];
                var tc2 = hit.Mesh.TextureCoordinates[hit.RayHit.VertexIndex2];
                var tc3 = hit.Mesh.TextureCoordinates[hit.RayHit.VertexIndex3];
                // calculate the texture coordinate of the hit point by linear interpolation
                var tc =
                    new Point(
                        (tc1.X * hit.RayHit.VertexWeight1) + (tc2.X * hit.RayHit.VertexWeight2) + (tc3.X * hit.RayHit.VertexWeight3),
                        (tc1.Y * hit.RayHit.VertexWeight1) + (tc2.Y * hit.RayHit.VertexWeight2) + (tc3.Y * hit.RayHit.VertexWeight3));
                Debug.WriteLine("Texture coordinate: " + tc);
            }
        }
    }
}
I am sure this example can be improved/simplified a lot, but it was the first I could think of :-)

See the SurfacePlot demo for an example on how to use a linear gradient brush and texture coordinates to create a color coded mesh! https://github.com/helix-toolkit/helix-toolkit/tree/master/Source/Examples/WPF/ExampleBrowser/Examples/SurfacePlot
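To connect that to the pipe case, below is a rough sketch (plain MeshGeometry3D rather than HelixToolkit code; all names are made up) of building a tube whose texture coordinates follow the bitmap layout discussed above: texture X runs along the pipe length and texture Y wraps around the circumference. Painting such a mesh with an ImageBrush made from the measurement bitmap, or with a gradient brush, then puts each matrix cell where you would expect it on the pipe.

using System;
using System.Windows;
using System.Windows.Media;
using System.Windows.Media.Media3D;

public static class TubeMeshSketch
{
    // Builds a simple open-ended tube along the Y axis. Texture X (u) follows the pipe
    // length and texture Y (v) follows the angle around the pipe.
    public static MeshGeometry3D CreateTube(double length, double radius, int lengthDiv, int thetaDiv)
    {
        var positions = new Point3DCollection();
        var textureCoordinates = new PointCollection();
        var triangleIndices = new Int32Collection();

        for (int i = 0; i <= lengthDiv; i++)
        {
            double y = length * i / lengthDiv;
            double u = (double)i / lengthDiv;               // along the pipe -> texture X

            for (int j = 0; j <= thetaDiv; j++)
            {
                double angle = 2 * Math.PI * j / thetaDiv;
                double v = (double)j / thetaDiv;            // around the pipe -> texture Y

                positions.Add(new Point3D(radius * Math.Sin(angle), y, radius * Math.Cos(angle)));
                textureCoordinates.Add(new Point(u, v));
            }
        }

        // Two triangles per quad between neighbouring rings of vertices.
        int ringSize = thetaDiv + 1;
        for (int i = 0; i < lengthDiv; i++)
        {
            for (int j = 0; j < thetaDiv; j++)
            {
                int a = (i * ringSize) + j;
                int b = a + 1;
                int c = a + ringSize;
                int d = c + 1;

                triangleIndices.Add(a); triangleIndices.Add(c); triangleIndices.Add(b);
                triangleIndices.Add(b); triangleIndices.Add(c); triangleIndices.Add(d);
            }
        }

        return new MeshGeometry3D
        {
            Positions = positions,
            TextureCoordinates = textureCoordinates,
            TriangleIndices = triangleIndices
        };
    }
}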
First of all, thank you for taking the time to help us, it is much appreciated.

After some testing with your example, and reading up on texture coordinates, we are now more familiar with what it does.

When hit testing we can get the texture coordinates from the point that has been clicked. The question now is how to map these coordinates onto the model so that they correspond to actual "pixels" and values from the measurement matrices, and how to build the pipe model correctly.

The next thing is to map all the "pixel" color values onto the pipe model, giving a good visual representation of the data.

After looking at the SurfacePlot example we can get an idea of how this is done, but we are unsure how to translate this to fit in with our task. We want to implement this functionality in our program, and also the meshgrid lines, which can be a helpful visual in our case. Do you know of any simpler examples on how to do this, or is this already a simple example?

So, to clarify:
  • How do we map the texture coordinates in the correct way on the/a pipe?
    • We have a data set which specifies the measurement area
  • How do we draw the colors (from our measurement values) to each pixel on the pipe?

Thinking about it, what we're actually asking is how do we combine the functionality between the two examples you've mentioned?

Ideally, we would have to get the hits from the two triangles that specify one pixel/measurement value. I think we should be able to figure this one out.
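If the texture coordinates are laid out with X along the pipe length and Y around the circumference, the interpolated texture coordinate from the hit test turns into matrix indices with a simple scaling step. A sketch (the method name is invented, and it assumes the texture coordinates span the full [0, 1] range in both directions); the tc value printed by CalculateHitTextureCoordinates above is exactly what would be passed in:

using System;
using System.Windows;

public static class HitMappingSketch
{
    // Converts an interpolated texture coordinate into the column/row of the measurement
    // matrix: texture X selects the column (along the pipe), texture Y selects the row (around it).
    public static void TextureCoordinateToPixel(Point tc, int columns, int rows, out int column, out int row)
    {
        column = (int)Math.Max(0, Math.Min(columns - 1, Math.Floor(tc.X * columns)));
        row = (int)Math.Max(0, Math.Min(rows - 1, Math.Floor(tc.Y * rows)));
    }
}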

Again, please ask should anything be unclear.