Hi

I am not really sure I understand: you have a vertex somewhere in Blender, i.e. a 3D coordinate (x, y, z), and you want to know where it will end up on your screen / in the render result?

For that you definitely need to know the mathematics of how a camera and projection work, and then write your own add-on (I do not know of anything out of the box).

- First you need the coordinates of a single vertex, e.g. in the middle of the eye. So in a mesh you must know the vertex index, from which you get the position (be aware that Blender has local (object) and world space; you need it in world space). Let's call that a(x,y,z)
- Then you need the position of the camera, also in world space: c(x,y,z)
- Then you need the orientation of the camera: theta(x,y,z)
- and finally the position of the display surface relative to the camera: e(x,y,z)

Once you have all of that, there is a formula for perspective projection:

https://en.wikipedia.org/wiki/3D_projection

The result would be b(x,y) in that formula.
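The variables above map directly onto that Wikipedia formula. A minimal Python sketch (plain math, no Blender API; the names a, c, theta, e are the ones from the list, angles in radians):

```python
import math

def project(a, c, theta, e):
    """Project world-space point a onto the 2D display surface.
    c     : camera position in world space
    theta : camera orientation as Euler angles (x, y, z), in radians
    e     : display surface position relative to the camera
            (e[2] is the distance from the camera to the plane)
    Returns b = (b_x, b_y), the 2D coordinates on the plane."""
    # Point relative to the camera
    px, py, pz = (a[i] - c[i] for i in range(3))
    sx, sy, sz = (math.sin(t) for t in theta)
    cx, cy, cz = (math.cos(t) for t in theta)
    # Rotate into camera space (expanded rotation matrices,
    # as written out on the Wikipedia page)
    dx = cy * (sz * py + cz * px) - sy * pz
    dy = sx * (cy * pz + sy * (sz * py + cz * px)) + cx * (cz * py - sz * px)
    dz = cx * (cy * pz + sy * (sz * py + cz * px)) - sx * (cz * py - sz * px)
    # Perspective divide onto the display surface
    bx = e[2] / dz * dx + e[0]
    by = e[2] / dz * dy + e[1]
    return bx, by
```

For example, with the camera at the origin, no rotation, and the plane at distance 1, `project((1, 2, 4), (0, 0, 0), (0, 0, 0), (0, 0, 1))` gives `(0.25, 0.5)`: the point is 4 units away, so it shrinks by a factor of 4.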

This is the mathematically correct approach. But the "weak" method is normally sufficient; you can check here:

(it uses a focal length)

https://math.stackexchange.com/questions/2305792/3d-projection-on-a-2d-plane-weak-maths-ressources

And here:

https://stackoverflow.com/questions/724219/how-to-convert-a-3d-point-into-2d-perspective-projection

And:

https://stackoverflow.com/questions/701504/perspective-projection-determine-the-2d-screen-coordinates-x-y-of-points-in-3

I once tried to calculate the camera distance from an object so that it is completely visible, using a bounding cube. My CAD math is unfortunately a bit rusty, so I needed a few tries to get a result. That problem was similar in a way, but I do not know an easy way.
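For completeness, the focal-length method from those links boils down to a divide by depth. A sketch, assuming the point is already expressed in camera space with z pointing away from the camera (that transform is the d from the Wikipedia formula above):

```python
def project_simple(d, f):
    """Focal-length projection: d is a point in camera space
    (d[2] > 0, i.e. in front of the camera), f is the focal length.
    Returns the projected 2D point."""
    return (f * d[0] / d[2], f * d[1] / d[2])
```

E.g. `project_simple((1, 2, 4), 2)` gives `(0.5, 1.0)`: a point 4 units deep, scaled by focal length 2.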

I am not 100% sure whether Blender exposes helper functions for this that an add-on can call (bpy_extras.object_utils.world_to_camera_view may be worth checking). The reason is that rendering is usually done the other way round: starting from a plane (your screen) and tracing where the ray hits ...
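As an aside, for the camera-distance problem I mentioned above there is at least a quick conservative estimate: enclose the bounding cube in a sphere and move the camera back until the sphere fits inside the field of view. A sketch (fov is the camera's smaller opening angle in radians; this slightly overestimates the distance, it is not a tight fit):

```python
import math

def camera_distance(bbox_min, bbox_max, fov):
    """Distance from the box center at which the whole bounding box
    fits in the view, via its enclosing sphere (conservative)."""
    # Half the box diagonal = radius of the enclosing sphere
    r = math.dist(bbox_min, bbox_max) / 2.0
    # A sphere of radius r at distance d subtends a half-angle of
    # asin(r / d), so d = r / sin(fov / 2) makes it just fit
    return r / math.sin(fov / 2.0)
```

For a 2x2x2 cube and a 90-degree field of view this gives sqrt(3)/sin(45°) = sqrt(6) ≈ 2.45 units from the center.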

Experts somewhere?