
In my app there is a feature where you can drag and drop an image; the image is in a UIImageView. It's a universal app. I need to store the x,y coordinates relative to a 2048x1536 canvas. I use CGPointApplyAffineTransform to compute the relative point, and invert the transform on the iPhone to find the corresponding position on the iPhone screen. Due to the pixel-density variation I am not getting the relative position correctly.

CGFloat TARGET_WIDTH  = 2048.0;
CGFloat TARGET_HEIGHT = 1536.0;

CGPoint thispoint = CGPointApplyAffineTransform(imageView.frame.origin,
    CGAffineTransformMakeScale(TARGET_HEIGHT / self.view.frame.size.height,
                               TARGET_WIDTH  / self.view.frame.size.width));

This is the code I use to find the relative point.

And I invert the transform with the iPhone view size. How do I calculate the relative point correctly given the different pixel densities of iPad and iPhone?


1 Answer


Try this:

CGFloat TARGET_WIDTH  = 2048.0;
CGFloat TARGET_HEIGHT = 1536.0;
CGFloat xScaleFactor = TARGET_WIDTH  / self.view.frame.size.width;
CGFloat yScaleFactor = TARGET_HEIGHT / self.view.frame.size.height;
CGPoint thispoint = CGPointMake(imageView.frame.origin.x * xScaleFactor,
                                imageView.frame.origin.y * yScaleFactor);
//point with respect to your given target size
answered 2013-03-14T09:17:23.280