hatch in vectorization
Hi,
Has anyone tried rendering/importing hatches in the vectorization framework?
The triangle tessellation done by OdGsBaseVectorizeView::draw() is not what we want, so we are trying to map the hatch loops/fill/gradients/patterns ourselves during import.
My real question: is there any way to turn the 2D entities forming the loops into a subclass of OdGiDrawable, so that we can let OdGsBaseVectorizeView::draw() handle the details of where to position the entity on the screen?
If not, which transformations do I have to apply to the 2D entities so that they end up in the same view as the geometry received in my implementations of circleProc(), polygonProc(), etc.?
Thanks

OK, I finally went through the AutoCAD documentation and found the reference to the "arbitrary axis algorithm", which helped me understand why the loops are in 2D. Based on my limited understanding I wrote the following code:
code:
void hatchImport(const OdDbHatch *pHatch)
{
  OdGePlane hatchPlane;
  OdDb::Planarity planarity;
  pHatch->getPlane(hatchPlane, planarity);
  OdGeMatrix3d ocs2WcsMatrix = OdGeMatrix3d::planeToWorld(hatchPlane);
  OdGeMatrix3d ocs2ScreenMatrix = ocs2WcsMatrix * m_world2ScreenMatrix;
  for (int loopIndex = 0; loopIndex < pHatch->numLoops(); loopIndex++)
  {
    OdInt32 type = pHatch->loopTypeAt(loopIndex);
    if (type & OdDbHatch::kPolyline)
    {
      OdGePoint2dArray vertices;
      OdGeDoubleArray bulges;
      pHatch->getLoopAt(loopIndex, vertices, bulges);
      for (unsigned int i = 0; i < vertices.length(); i++)
      {
        OdGePoint3d v3d = convert2dPointTo3dPoint(vertices[i]);
        v3d.transformBy(ocs2ScreenMatrix);
        // process v3d for my app
      }
    }
    else
    {
      // similarly for non-polyline (edge) loops
    }
  }
}
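For completeness, here is what the convert2dPointTo3dPoint() helper used above might look like; it is the poster's own helper, so the definition below is only an assumption (header paths assumed as well). The loop vertices are 2D OCS coordinates, so the z component is simply left at 0; whether the plane matrix then supplies the hatch elevation is exactly query 1 below.
code:
#include "Ge/GePoint2d.h"
#include "Ge/GePoint3d.h"

// Assumed helper (not from the original post): lift a 2D OCS loop vertex
// into 3D with z = 0 so that ocs2ScreenMatrix can be applied to it.
static OdGePoint3d convert2dPointTo3dPoint(const OdGePoint2d& pt)
{
  return OdGePoint3d(pt.x, pt.y, 0.0);
}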
where I am calculating m_world2ScreenMatrix as follows:
code:
void ExSimpleView::update()
{
  device()->setDrawContext(drawContext());
  OdGeMatrix3d eye2Screen(eyeToScreenMatrix());
  setEyeToOutputTransform(eye2Screen);
  OdGeMatrix3d world2Eye = viewingMatrix();
  OdGeMatrix3d world2Screen = world2Eye.postMultBy(eye2Screen);
  // store world2Screen as m_world2ScreenMatrix
  OdGsBaseVectorizeView::update();
}
Of course, it is not working as expected, or else I would not be posting here.
I have the following queries:
1. Is the calculation of ocs2WcsMatrix in the first code chunk correct? That is, will the matrix, when applied, actually convert a point from the hatch OCS to its actual WCS coordinates? (Somehow I get the feeling that it is not taking the elevation of the hatch plane into consideration, but that is just a wild guess; see the sanity-check sketch after this post.)
2. Is the calculation of m_world2ScreenMatrix correct? I want a matrix that converts WCS coordinates into viewing coordinates.
Someone please help!
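One concrete way to probe query 1 is to check whether a loop vertex mapped through ocs2WcsMatrix actually lands on the hatch plane at the hatch elevation. A minimal sketch, assuming OdDbHatch::normal() and OdDbHatch::elevation() behave as declared in the SDK headers (header paths assumed):
code:
#include <cmath>
#include "DbHatch.h"
#include "Ge/GeMatrix3d.h"
#include "Ge/GePoint3d.h"

// Returns true if ocs2WcsMatrix carries the hatch elevation: a loop vertex at
// the OCS origin (z = 0) should land on the hatch plane, i.e. at a signed
// distance of pHatch->elevation() along the (unit) hatch normal.
static bool ocsMatrixCarriesElevation(const OdDbHatch* pHatch,
                                      const OdGeMatrix3d& ocs2WcsMatrix)
{
  OdGePoint3d probe(0.0, 0.0, 0.0);
  probe.transformBy(ocs2WcsMatrix);
  double distAlongNormal = probe.asVector().dotProduct(pHatch->normal());
  return std::fabs(distAlongNormal - pHatch->elevation()) < 1.0e-9;
}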
I think the order is incorrect. Try
code:
OdGeMatrix3d ocs2ScreenMatrix = m_world2ScreenMatrix * ocs2WcsMatrix;
Sergey Slezkin
Changing the order is not working; it does not give the correct result even when we try to import the top view (where m_world2ScreenMatrix is the identity matrix).

It does work!! Well, almost, anyway.
We were multiplying the matrices in the wrong order during circle import as well, hence my earlier observation that it didn't work. Sorry about that.
Now, in all perspective views, the scale, rotation and shear are mapped correctly, but we are still having problems with the translation.
Here's what we're doing now:
1. We map the loops in the hatch (in OCS) and the color/gradient/pattern to our application objects.
2. We calculate the 3D transformation matrix for mapping from the hatch OCS to view coordinates:
code:
OdGePlane hatchPlane;
OdDb::Planarity planarity;
pHatch->getPlane(hatchPlane, planarity);
OdGeMatrix3d ocs2WcsMatrix = OdGeMatrix3d::planeToWorld(hatchPlane);
OdGeMatrix3d ocs2ScreenMatrix = ocs2WcsMatrix * m_world2ScreenMatrix;
3. We convert the 3D transformation to a 2D transformation (as all objects in our app are 2D):
code:
appMatrix mat;
mat.a = ocs2ScreenMatrix[0][0];
mat.b = ocs2ScreenMatrix[1][0];
mat.c = ocs2ScreenMatrix[0][1];
mat.d = ocs2ScreenMatrix[1][1];
mat.tx = ocs2ScreenMatrix[0][3];
mat.ty = ocs2ScreenMatrix[1][3];
4. We apply this transform to the objects representing the hatch in our app. The transformation works as follows (a sketch of such a 2D affine type is given right after these steps):
code:
xnew = a * xold + c * yold + tx;
ynew = b * xold + d * yold + ty;
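For reference, a minimal self-contained sketch of what such a 2x3 affine type and its application could look like; appMatrix is the poster's own application type, so this definition is only an assumption matching the equations above:
code:
// Hypothetical 2x3 affine matrix mirroring the extraction in step 3 and the
// equations in step 4; not the actual application type.
struct appMatrix
{
  double a, b, c, d;   // linear part (rotation/scale/shear)
  double tx, ty;       // translation
};

// Apply the affine transform to a 2D point in place.
inline void applyAppMatrix(const appMatrix& m, double& x, double& y)
{
  double xOld = x, yOld = y;
  x = m.a * xOld + m.c * yOld + m.tx;
  y = m.b * xOld + m.d * yOld + m.ty;
}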
What is happening now is that, for instance, we create a polyline and hatch it in AutoCAD. Upon import, the polyline (which goes through vectorization) and the hatch (which we map and transform separately) are shifted relative to each other. In short, tx and ty are not what they should be.
As I said earlier, the shape and size of both the polyline and the hatch are identical; only their relative positions don't match up.
Can anyone help?
Thanks
(PS: while someone is at it, please also explain why the order of matrix multiplication should be the one Sergey pointed out. We are trying to map from OCS to view coordinates, and hence we wrote ocs2WcsMatrix * m_world2ScreenMatrix.)
Try the following:
code:
OdGeMatrix3d ocs2WcsMatrix = OdGeMatrix3d::planeToWorld(pHatch->normal());
On the order of matrix multiplication:
The order of multiplication defines the order in which transforms are applied to a vertex; the rightmost transform is applied first:
m = m1 * m2;
pt.transformBy(m);
This is equivalent to applying the transforms in the following order:
pt.transformBy(m2);
pt.transformBy(m1);
Sergey Slezkin
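To make the ordering concrete, here is a small standalone sketch with hypothetical translation and scaling values (header paths assumed), showing that with m = m1 * m2 the m2 transform acts on the point first:
code:
#include "Ge/GeMatrix3d.h"
#include "Ge/GePoint3d.h"
#include "Ge/GeVector3d.h"

void multiplicationOrderDemo()
{
  OdGeMatrix3d m1 = OdGeMatrix3d::translation(OdGeVector3d(10.0, 0.0, 0.0)); // move +10 in X
  OdGeMatrix3d m2 = OdGeMatrix3d::scaling(2.0);                              // scale x2 about the origin

  // Combined matrix: the rightmost factor (m2) acts on the point first.
  OdGeMatrix3d m = m1 * m2;

  OdGePoint3d a(1.0, 0.0, 0.0);
  a.transformBy(m);                  // scale first, then translate -> (12, 0, 0)

  OdGePoint3d b(1.0, 0.0, 0.0);
  b.transformBy(m2);                 // scale first ...
  b.transformBy(m1);                 // ... then translate -> (12, 0, 0), same as a

  OdGePoint3d c(1.0, 0.0, 0.0);
  c.transformBy(m1);                 // translate first ...
  c.transformBy(m2);                 // ... then scale -> (22, 0, 0), a different point
}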

Hi Sergey,
I tried using pHatch->normal(), but it didn't work; the hatch now comes out shifted by some other amount. I am still stuck.
In fact, the same thing is happening with other geometric figures as well. I tried mapping an instance of OdDbEllipse myself. Here is what I did:
code:
void ExSimpleView::draw(const OdGiDrawable* pDrawable)
{
  ...
  OdGsBaseVectorizeView::draw(pDrawable); // vectorize the entity
  ...
  if (pDrawable->isA()->isDerivedFrom(OdDbEllipse::desc()))
  {
    const OdDbEllipse *ep = reinterpret_cast<const OdDbEllipse *>(pDrawable);

    OdGeEllipArc3d ell(ep->center(),
                       ep->majorAxis(),
                       ep->minorAxis(),
                       ep->majorAxis().length(),
                       ep->minorAxis().length());
    ell.transformBy(m_world2ScreenMatrix);
    // map ell the same way we do for the OdGeEllipArc3d parameter passed to ExGsSimpleDevice::ellipArcProc()
  }
}
where m_world2ScreenMatrix is calculated as
code:
void ExSimpleView::update()
{
  device()->setDrawContext(drawContext());
  OdGeMatrix3d eye2Screen(eyeToScreenMatrix());
  setEyeToOutputTransform(eye2Screen);
  OdGeMatrix3d world2Eye = viewingMatrix();
  OdGeMatrix3d world2Screen = world2Eye.postMultBy(eye2Screen);
  // store world2Screen as m_world2ScreenMatrix
  OdGsBaseVectorizeView::update();
}
The result is that I get two ellipses similar in size, shape and orientation, but shifted with respect to each other. The shift is not constant and seems to depend on the size of the OdGsDCRect we pass to pDevice->onSize().
Are there other transformations that you apply within the vectorizer? If so, how can I emulate them?
Thanks
Hi Sergey,
Any advice on this?

Dear varunsnair,
Model transform (block reference) + world-to-screen transform. No other transforms.
Also, please make sure that you are calling update() on the wrapper device (the one you get from setupActiveLayoutViews), not on the original device.
Sincerely yours,
George Udov
Hi George,
We are not applying any block transformation. How do we get access to this transformation? We have an instance of OdDbHatch passed to our device's draw() method.
Also, do we apply the block transformation first, or the world-to-screen transformation first?
Thanks,
Varun

Hi George,
Could you please guide me on how to incorporate the "model transform (block reference)" into the scheme of things here?
I tried looking up the block of the hatch and setting up a transformation matrix, but the only thing of relevance I saw in the block was its origin, and even with a block origin at (0, 0, 0) the hatch does not come out in the right place.
Thanks,
Varun

The complete transformation (including both the block and world-to-screen transforms) can be obtained from OdGsView::objectToDeviceMatrix().
And once more: are you sure you called update() on the wrapper device that was returned from setupActiveLayoutViews()?
Sincerely yours,
George Udov

Thanks George,
objectToDeviceMatrix() looks like what we require, but am I correct in assuming that the correct usage is:
code:
OdGeMatrix3d ocs2ScreenMatrix = ocs2WcsMatrix * m_objectToDeviceMatrix();
That is, we still need to prepend the OCS-to-WCS transform to get the final matrix that can be applied to the loops created from OCS coordinates?
And yes, we are calling update() on the device returned by OdDbGsManager::setupLayoutViews() and not on the one we pass into that call.
We had gotten everything to work except for the case of block references, where our amateur attempt at creating the matrix was failing because we did not have the block transform matrix.
Incidentally, anyone using the code I posted earlier should note that the order of multiplication was wrong in the computation of world-to-screen:
code:
OdGeMatrix3d world2Screen = world2Eye.postMultBy(eye2Screen);
It should have been a pre-multiply. This led to the problem where almost everything was correct (shear, scale, rotation) but not the translation, even for hatches not in a block reference. Hopefully no one will need these arcane computations now! A sketch of the corrected computation follows this post.
Thanks,
Varun
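For anyone who does still need it, a minimal sketch of what the corrected ExSimpleView::update() could look like, assuming m_world2ScreenMatrix is a member of the view as in the earlier snippets:
code:
// Corrected world-to-screen computation (sketch): pre-multiplying by
// eye2Screen yields world2Screen = eye2Screen * world2Eye, so the viewing
// transform is applied to a point first and the eye-to-screen mapping second.
void ExSimpleView::update()
{
  device()->setDrawContext(drawContext());
  OdGeMatrix3d eye2Screen(eyeToScreenMatrix());
  setEyeToOutputTransform(eye2Screen);

  OdGeMatrix3d world2Screen = viewingMatrix();   // world -> eye
  world2Screen.preMultBy(eye2Screen);            // then eye -> screen
  m_world2ScreenMatrix = world2Screen;           // cached member (assumed)

  OdGsBaseVectorizeView::update();
}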

The correct usage is:
code:
OdGeMatrix3d ocs2ScreenMatrix = objectToDeviceMatrix() * ocs2WcsMatrix;
Sincerely yours,
George Udov

I messed up the order again.
Thanks, it works!!
Regards,
Varun
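Putting the thread's conclusion together, here is a minimal sketch of the final hatch transform setup: the OCS-to-WCS matrix built from the hatch plane, combined with the view's complete object-to-device transform (block + world-to-screen), rightmost factor applied first. It reuses the loop handling from the first post; header paths and the placement of this function on the view class are assumptions:
code:
#include "DbHatch.h"
#include "Ge/GeMatrix3d.h"
#include "Ge/GePlane.h"
#include "Ge/GePoint3d.h"

// Sketch of the approach this thread converged on. objectToDeviceMatrix()
// already contains the block-reference and world-to-screen transforms.
void ExSimpleView::importHatch(const OdDbHatch* pHatch)
{
  OdGePlane hatchPlane;
  OdDb::Planarity planarity;
  pHatch->getPlane(hatchPlane, planarity);

  OdGeMatrix3d ocs2WcsMatrix = OdGeMatrix3d::planeToWorld(hatchPlane);
  OdGeMatrix3d ocs2ScreenMatrix = objectToDeviceMatrix() * ocs2WcsMatrix;

  for (int loopIndex = 0; loopIndex < pHatch->numLoops(); loopIndex++)
  {
    if (pHatch->loopTypeAt(loopIndex) & OdDbHatch::kPolyline)
    {
      OdGePoint2dArray vertices;
      OdGeDoubleArray bulges;
      pHatch->getLoopAt(loopIndex, vertices, bulges);
      for (unsigned int i = 0; i < vertices.length(); i++)
      {
        OdGePoint3d v3d(vertices[i].x, vertices[i].y, 0.0);
        v3d.transformBy(ocs2ScreenMatrix);
        // hand v3d to the application's 2D hatch representation here
      }
    }
    // non-polyline (edge) loops would be handled analogously
  }
}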