iPhone - Filling a portion of an image with color

I'm working on an iPhone painting app. I would like to paint a particular portion of an image: using a touch event I find the pixel data at the touched point, and then I want to use that pixel data to paint the remaining part of the image. In the touch event I get the pixel value for the touched point like this:

- (void)touchesBegan:(NSSet *)touches withEvent:(UIEvent *)event {

    UITouch *touch = [touches anyObject];
    startPoint = [touch locationInView:imageView];

    NSLog(@"the value of the index is %i", index);

    // Pattern image used as the paint colour.
    NSString *imageName = [NSString stringWithFormat:@"roof%i", index];
    tempColor = [[UIColor alloc] initWithPatternImage:[UIImage imageNamed:imageName]];

    lastPoint = [touch locationInView:self.view];
    lastPoint.y -= 20;

    // Integer pixel coordinates of the touch.
    int ix = (int)lastPoint.x;
    int iy = (int)lastPoint.y;

    NSLog(@"the touch location is %i and %i", ix, iy);

    [self getRGBAsFromImage:newImage atX:ix andY:iy count1:1];
}

Here I'm getting the pixel data for the image:

- (NSArray *)getRGBAsFromImage:(UIImage *)image atX:(int)xx andY:(int)yy count1:(int)count
{
    NSMutableArray *result = [NSMutableArray arrayWithCapacity:count];

    // First get the image into your data buffer
    CGImageRef imageRef = [image CGImage];
    NSUInteger width = CGImageGetWidth(imageRef);
    NSUInteger height = CGImageGetHeight(imageRef);
    CGColorSpaceRef colorSpace = CGColorSpaceCreateDeviceRGB();
    unsigned char *rawData = malloc(height * width * 4);
    NSUInteger bytesPerPixel = 4;
    NSUInteger bytesPerRow = bytesPerPixel * width;
    NSUInteger bitsPerComponent = 8;
    CGContextRef context = CGBitmapContextCreate(rawData, width, height,
                                                 bitsPerComponent, bytesPerRow, colorSpace,
                                                 kCGImageAlphaPremultipliedLast | kCGBitmapByteOrder32Big);
    CGColorSpaceRelease(colorSpace);

    CGContextDrawImage(context, CGRectMake(0, 0, width, height), imageRef);
    CGContextRelease(context);

    // Now your rawData contains the image data in the RGBA8888 pixel format.
    int byteIndex = (bytesPerRow * yy) + xx * bytesPerPixel;
    for (int ii = 0 ; ii < count ; ++ii)
    {
        CGFloat red   = (rawData[byteIndex]     * 1.0) / 255.0;
        CGFloat green = (rawData[byteIndex + 1] * 1.0) / 255.0;
        CGFloat blue  = (rawData[byteIndex + 2] * 1.0) / 255.0;
        CGFloat alpha = (rawData[byteIndex + 3] * 1.0) / 255.0;
        byteIndex += 4;
        NSLog(@"the vale of the rbg of red is %f",red);

        UIColor *acolor = [UIColor colorWithRed:red green:green blue:blue alpha:alpha];
        [result addObject:acolor];
    }

    free(rawData);
    return result;
}

Using a tolerance value I can match pixels against the sampled colour (see the sketch after this method for the kind of comparison I mean), but this is where I'm struggling to paint the remaining section:

- (BOOL)cgHitTestForArea:(CGRect)area {
    BOOL hit = FALSE;

    CGColorSpaceRef colorspace = CGColorSpaceCreateDeviceRGB();

    float areaFloat = ((area.size.width * 4) * area.size.height);
    unsigned char *bitmapData = malloc(areaFloat);    

    CGContextRef context = CGBitmapContextCreate(bitmapData,
                                                 area.size.width,
                                                 area.size.height,
                                                 8,
                                                 4*area.size.width,
                                                 colorspace,
                                                 kCGImageAlphaPremultipliedLast);
    CGContextTranslateCTM(context, -area.origin.x, -area.origin.y);
    [self.layer renderInContext:context];

    //Seek through all pixels.    
    float transparentPixels = 0;
    for (int i = 0; i < (int)areaFloat ; i += 4) {
        //Count each transparent pixel.
        if (((bitmapData[i + 3] * 1.0) / 255.0) == 0) {
            transparentPixels += 1;
        }
    }
    free(bitmapData);

    //Calculate the percentage of transparent pixels. 
    float hitTolerance = [[self.layer valueForKey:@"hitTolerance"] floatValue];

    NSLog(@"Apixels: %f hitPercent: %f", transparentPixels, (transparentPixels/(areaFloat/4)));

    if ((transparentPixels/(areaFloat/4)) < hitTolerance) {
        hit = TRUE;
    }    

    CGColorSpaceRelease(colorspace);
    CGContextRelease(context);

    return hit;    
}
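
For reference, a per-channel tolerance comparison of the kind described above might look like the sketch below; the function name, and treating the tolerance as a 0-255 per-channel difference, are assumptions for illustration, not part of my existing code:

// Hypothetical helper: does the RGBA pixel at `p` match the sampled seed pixel
// `seed` within `tolerance` (0-255) on every channel?
static BOOL pixelMatchesWithinTolerance(const unsigned char *p,
                                        const unsigned char *seed,
                                        int tolerance) {
    for (int c = 0; c < 4; c++) {
        int diff = (int)p[c] - (int)seed[c];
        if (diff < 0) diff = -diff;
        if (diff > tolerance) {
            return NO;
        }
    }
    return YES;
}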

Any suggestions to make this work please?

See Question&Answers more detail:os

与恶龙缠斗过久,自身亦成为恶龙;凝视深渊过久,深渊将回以凝视…
Welcome To Ask or Share your Answers For Others

1 Answer


First, turning a bitmap image into an NSArray of UIColor objects is nuts; it's way too much overhead. Work with a pixel buffer instead, and learn how to use pointers.
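
For example, instead of building UIColor objects, the touched pixel can be read straight out of the raw buffer that CGBitmapContextCreate already fills. A minimal sketch, assuming the 4-byte-per-pixel RGBA layout used in the question (the RGBAPixel struct and pixelAt name are illustrative):

// Hypothetical helper: read one RGBA pixel directly from the buffer filled by
// CGBitmapContextCreate (kCGImageAlphaPremultipliedLast, 4 bytes per pixel).
typedef struct {
    unsigned char r, g, b, a;
} RGBAPixel;

static inline RGBAPixel pixelAt(const unsigned char *rawData,
                                size_t bytesPerRow,
                                int x, int y) {
    const unsigned char *p = rawData + (size_t)y * bytesPerRow + (size_t)x * 4;
    RGBAPixel px = { p[0], p[1], p[2], p[3] };
    return px;
}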

http://en.wikipedia.org/wiki/Flood_fill#The_algorithm provides a good overview of a few simple techniques for performing a flood-fill — using either recursion or queues.
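
As a rough illustration of the queue-based variant described there, here is a sketch that operates directly on the same RGBA buffer and reuses the RGBAPixel/pixelAt helpers above. The floodFill name, the exact-match comparison, and the NSNumber-based work queue are all placeholder choices you would adapt (for example, swapping in a tolerance test):

// Sketch of a queue-based flood fill over a width x height RGBA8888 buffer
// (the rawData layout from the question). Pixels connected to (startX, startY)
// whose colour matches the seed pixel exactly are overwritten with fillColor.
static void floodFill(unsigned char *rawData, int width, int height,
                      int startX, int startY, RGBAPixel fillColor) {
    size_t bytesPerRow = (size_t)width * 4;
    RGBAPixel seed = pixelAt(rawData, bytesPerRow, startX, startY);

    // Nothing to do if the seed already has the fill colour (avoids looping forever).
    if (seed.r == fillColor.r && seed.g == fillColor.g &&
        seed.b == fillColor.b && seed.a == fillColor.a) {
        return;
    }

    // Work queue of pixel indices (index = y * width + x).
    NSMutableArray *queue = [NSMutableArray arrayWithObject:@(startY * width + startX)];

    while (queue.count > 0) {
        int idx = [[queue objectAtIndex:0] intValue];
        [queue removeObjectAtIndex:0];

        int x = idx % width;
        int y = idx / width;
        unsigned char *p = rawData + (size_t)y * bytesPerRow + (size_t)x * 4;

        // Skip pixels that no longer match the seed colour (including already-filled ones).
        if (p[0] != seed.r || p[1] != seed.g || p[2] != seed.b || p[3] != seed.a) {
            continue;
        }

        // Paint this pixel.
        p[0] = fillColor.r; p[1] = fillColor.g; p[2] = fillColor.b; p[3] = fillColor.a;

        // Enqueue the four neighbours that lie inside the image.
        if (x + 1 < width)  [queue addObject:@(y * width + (x + 1))];
        if (x > 0)          [queue addObject:@(y * width + (x - 1))];
        if (y + 1 < height) [queue addObject:@((y + 1) * width + x)];
        if (y > 0)          [queue addObject:@((y - 1) * width + x)];
    }
}

Once the buffer has been modified, one way to get the result back on screen is to keep the CGBitmapContextRef alive while you fill, then call CGBitmapContextCreateImage() on it and wrap the result with [UIImage imageWithCGImage:] before assigning it to the image view.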

