For some reason, I can only get a UIImageView to display when I alloc/init it with a different image on each iteration. What's strange is that I know the image data is being loaded, because the processing I run on the image works as expected. In short, here are the two approaches I tried:
// interface
#import <UIKit/UIKit.h>

@interface ViewController : UIViewController <UIAlertViewDelegate>
{
    UIImageView *imageView;
}
@property (nonatomic, retain) UIImageView *imageView;
@end

// implementation
@implementation ViewController
@synthesize imageView;
//...
- (void)loadAndDisplayImage {
    // Load the test image
    UIImage *testImg = [UIImage imageNamed:@"Test.png"];
    // initWithImage: sizes the new view's frame to match the image;
    // autorelease because the retain property takes ownership under MRC
    self.imageView = [[[UIImageView alloc] initWithImage:testImg] autorelease];
    // Size of the imageView's rect
    CGRect frame = self.imageView.frame;
    int ivw = frame.size.width;
    int ivh = frame.size.height;
    //...
}
@end
When I use this first version, creating the view with initWithImage:, ivw and ivh have valid values and the image is displayed. However, if I change the implementation to this:
// implementation
@implementation ViewController
@synthesize imageView;
//...
- (void)viewDidLoad {
    [super viewDidLoad];
    // Create an empty image view up front (autorelease under MRC, as above)
    self.imageView = [[[UIImageView alloc] init] autorelease];
    [self loadAndDisplayImage];
}
- (void)loadAndDisplayImage {
    // Load the test image
    UIImage *testImg = [UIImage imageNamed:@"Test.png"];
    // Assign the image to the existing view instead of creating a new one
    self.imageView.image = testImg;
    // Size of the imageView's rect
    CGRect frame = self.imageView.frame;
    int ivw = frame.size.width;
    int ivh = frame.size.height;
    //...
}
@end
Where I set the image with self.imageView.image = testImg;, the values ivw and ivh are both zero and no image is displayed, yet the subsequent processing on the image is still accurate. In both cases I send the image to processing with [self doRecognizeImage:self.imageView.image];. I can't figure out how this is possible; it would make much more sense to me if the processing failed whenever the image could not be shown.
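To make the difference concrete, here is a minimal sketch of the check I run in the second version (the NSLog calls are just for this post; doRecognizeImage: is my own method, elided above):

- (void)loadAndDisplayImage {
    UIImage *testImg = [UIImage imageNamed:@"Test.png"];
    self.imageView.image = testImg;

    // The image property is set...
    NSLog(@"image: %@", self.imageView.image); // non-nil

    // ...but the view's frame is still zero, so nothing appears on screen
    NSLog(@"frame: %@", NSStringFromCGRect(self.imageView.frame)); // {{0, 0}, {0, 0}}

    // Recognition still works, because the UIImage itself is intact
    [self doRecognizeImage:self.imageView.image];
}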
Ideas? Thanks.