I am using the following code to add an image to a UITextView:
UITextView *textView = [[UITextView alloc] initWithFrame:CGRectMake(200,200,140,140)];
textView.font = [UIFont systemFontOfSize:20.0f];
NSMutableAttributedString *attributedString = [[NSMutableAttributedString alloc] initWithString:@"Test with emoji"];
NSTextAttachment *textAttachment = [[NSTextAttachment alloc] init];
textAttachment.image = [UIImage imageNamed:@"Angel.png"];
// re-create the image at 3x scale so it is drawn smaller, in line with the text
textAttachment.image = [UIImage imageWithCGImage:textAttachment.image.CGImage scale:3.0 orientation:UIImageOrientationUp];
NSAttributedString *attrStringWithImage = [NSAttributedString attributedStringWithAttachment:textAttachment];
[attributedString replaceCharactersInRange:NSMakeRange(5, 1) withAttributedString:attrStringWithImage];
[attributedString addAttribute:NSFontAttributeName value:[UIFont systemFontOfSize:17] range:NSMakeRange(0, attributedString.length)];
textView.attributedText = attributedString;
NSLog(@"Text view: %@", textView.attributedText);
[self.view addSubview:textView];
and the result looks like this:
What I am interested in is how I can tell which image was inserted into the text view, and at which position. I was thinking of using attributedText, as you can see in the code, since it logs:
Text view: Test {
NSFont = "<UICTFont: 0x7ff0324f2110> font-family: \".HelveticaNeueInterface-Regular\"; font-weight: normal; font-style: normal; font-size: 17.00pt";
}{
NSAttachment = "<NSTextAttachment: 0x7ff032682bc0>";
NSFont = "<UICTFont: 0x7ff0324f2110> font-family: \".HelveticaNeueInterface-Regular\"; font-weight: normal; font-style: normal; font-size: 17.00pt";
}with emoji{
NSFont = "<UICTFont: 0x7ff0324f2110> font-family: \".HelveticaNeueInterface-Regular\"; font-weight: normal; font-style: normal; font-size: 17.00pt";
}
Update
I retrieved the image using this code:
NSMutableArray *imagesArray = [[NSMutableArray alloc] init];
[attributedString enumerateAttribute:NSAttachmentAttributeName
                             inRange:NSMakeRange(0, [attributedString length])
                             options:0
                          usingBlock:^(id value, NSRange range, BOOL *stop)
{
    if ([value isKindOfClass:[NSTextAttachment class]])
    {
        NSTextAttachment *attachment = (NSTextAttachment *)value;
        UIImage *image = nil;
        if ([attachment image])
            image = [attachment image];
        else
            image = [attachment imageForBounds:[attachment bounds]
                                 textContainer:nil
                                characterIndex:range.location];
        if (image)
            [imagesArray addObject:image];
    }
}];
But what if attributedString contains more than one consecutive photo? For example:
Code:
NSMutableAttributedString *attributedString = [[NSMutableAttributedString alloc] initWithString:@"Test with emoji "];
[attributedString replaceCharactersInRange:NSMakeRange(4, 1) withAttributedString:attrStringWithImage];
[attributedString replaceCharactersInRange:NSMakeRange(5, 1) withAttributedString:attrStringWithImage];
Log:
Image array: (
"<UIImage: 0x7fd4e3e56760>"
)
Code:
NSMutableAttributedString *attributedString = [[NSMutableAttributedString alloc] initWithString:@"Test with emoji "];
[attributedString replaceCharactersInRange:NSMakeRange(4, 1) withAttributedString:attrStringWithImage];
[attributedString replaceCharactersInRange:NSMakeRange(16, 1) withAttributedString:attrStringWithImage];
Log:
Image array: (
"<UIImage: 0x7f9ce35a4a70>",
"<UIImage: 0x7f9ce35a4a70>"
)
So, is there a bug in what I am doing, or in the enumerateAttribute: method?
Update 2
I managed to fix the issue by creating a new textAttachment and attrStringWithImage instance for each photo I add.
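For reference, this is a sketch of that fix based on the snippets above (the variable names are mine). The key point is that each inserted image gets its own NSTextAttachment, so enumerateAttribute: sees two distinct attribute values instead of coalescing two consecutive occurrences of the same object into one range:

```objc
// One NSTextAttachment per inserted image. Reusing a single attachment makes
// consecutive runs share the same attribute value, which enumerateAttribute:
// then reports as a single range.
NSMutableAttributedString *attributedString =
    [[NSMutableAttributedString alloc] initWithString:@"Test with emoji "];

NSTextAttachment *firstAttachment = [[NSTextAttachment alloc] init];
firstAttachment.image = [UIImage imageNamed:@"Angel.png"];
NSAttributedString *firstImageString =
    [NSAttributedString attributedStringWithAttachment:firstAttachment];
[attributedString replaceCharactersInRange:NSMakeRange(4, 1)
                      withAttributedString:firstImageString];

NSTextAttachment *secondAttachment = [[NSTextAttachment alloc] init];
secondAttachment.image = [UIImage imageNamed:@"Angel.png"];
NSAttributedString *secondImageString =
    [NSAttributedString attributedStringWithAttachment:secondAttachment];
[attributedString replaceCharactersInRange:NSMakeRange(5, 1)
                      withAttributedString:secondImageString];
```

With separate attachments, the enumeration block from the earlier snippet is called once per image, even when the images are adjacent.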
Retrieving the images is explained above.
Your new issue arises when two consecutive images are identical.
So instead of:
if (image)
    [imagesArray addObject:image];
you need to do additional checks. The following will do for two images, but you still can't know whether they are consecutive or not:
if (image)
{
    if ([imagesArray lastObject] != image)
        [imagesArray addObject:image];
}
So you need to keep track of the NSRange too:
if (image)
{
    if ([imagesArray count] > 0)
    {
        NSDictionary *lastFound = [imagesArray lastObject];
        NSRange lastRange = [lastFound[@"range"] rangeValue];
        UIImage *lastImage = lastFound[@"image"];
        if (lastImage == image && lastRange.location + lastRange.length == range.location)
        {
            // Two images, identical and consecutive
        }
        else
        {
            [imagesArray addObject:@{@"image": image, @"range": [NSValue valueWithRange:range]}];
        }
    }
    else
    {
        [imagesArray addObject:@{@"image": image, @"range": [NSValue valueWithRange:range]}];
    }
}
Retrieving only images:
NSArray *onlyImages = [imagesArray valueForKey:@"image"];
Note: I didn't check whether this code compiles, but you should get the whole idea.
My range calculation may be wrong (some +1/-1 missing, but nothing difficult to verify with a test). And what if there is a space between two identical consecutive images? You may want to get the string between them:
NSString *stringBetween = [[attributedString string] substringWithRange:NSMakeRange(lastRange.location + lastRange.length, range.location - (lastRange.location + lastRange.length))];
and check it for spaces, punctuation characters, and so on (there are many ways to do it).
Additional note:
In your case, just comparing image != newImage may be enough. But if you use web images, or two images in your bundle with different names but identical content, deciding whether they are "the same" is another problem entirely. There are a few questions on SO about comparing two images, but that takes some time/resources.
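If you do want a simple content-based comparison, one common approach (a minimal sketch, not a robust solution) is to compare the PNG-encoded bytes of the two images. Note this can report NO for visually identical images that were encoded differently, and it is not cheap for large images:

```objc
// Compare two UIImages by their PNG-encoded data.
// Fast path: same NSData pointer (or both nil); otherwise byte comparison.
static BOOL ImagesHaveEqualPNGData(UIImage *a, UIImage *b)
{
    NSData *dataA = UIImagePNGRepresentation(a);
    NSData *dataB = UIImagePNGRepresentation(b);
    return (dataA == dataB) || [dataA isEqualToData:dataB];
}
```

You would use this in place of the lastImage == image pointer check when the attachments may hold distinct UIImage instances with the same content.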