It is known that the images backing Apple's emoji in Unicode can be extracted at resolutions as high as 160x160 pixels. There is a tool that can do this on OS X by pulling the data out of the "Apple Color Emoji" font: https://github.com/tmm1/emoji-extractor
I am interested in extracting/using Apple's emoji images in a similar way on iOS. I found the following solution and really like the implementation: https://stackoverflow.com/a/38809531/4556704
import UIKit

extension String {
    // Renders the string (e.g. a single emoji character) into a 30x35 pt image.
    func image() -> UIImage? {
        let size = CGSize(width: 30, height: 35)
        // A scale of 0 means "use the device's main screen scale".
        UIGraphicsBeginImageContextWithOptions(size, false, 0)
        defer { UIGraphicsEndImageContext() }

        // Fill the background with white.
        UIColor.white.set()
        let rect = CGRect(origin: .zero, size: size)
        UIRectFill(rect)

        // Draw the string with the 30 pt system font.
        (self as NSString).draw(in: rect, withAttributes: [.font: UIFont.systemFont(ofSize: 30)])
        return UIGraphicsGetImageFromCurrentImageContext()
    }
}
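With that extension in place, producing an image is a one-liner (the emoji here is an arbitrary example):

    let emojiImage = "😀".image() // an optional 30x35 pt UIImage backed by a small bitmap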
There is a problem, however, that I am running into with this implementation. If I use the String extension above, the resulting image is clearly not 160x160 px in resolution. What I mean is that I am unable to produce an image of quality comparable to, say, a 160x160 image included in my project's assets.
What is the reason for this, and can I solve this problem without importing upwards of 1700 emoji images into my project? I figure the font on iOS must still contain the data necessary to produce 160x160 image samples. The String extension is ideal, as it allows me to reference the emoji characters more freely than if I were to simply import all of those images.
I figure this may require some low-level code that could be more trouble than it's worth, but I would really love to know more about the root of this issue and possible fixes. I'm sure this is due to Apple not rendering the emoji at a resolution higher than necessary, but surely there is some way to override this. Also, I am aware that there are other high-resolution emoji image sets; this question is not about that.
edit 10/18/16: I've come to the conclusion that the issue can be attributed to the source image being sampled not being of the highest resolution available. It seems the emoji are not rendered at resolutions higher than 48x48 px on iOS, even though it has been demonstrated that they can be rendered at up to 160x160 px on OS X, and can be ripped from Apple Color Emoji.ttf/.ttc at 160x160 px. I am not sure there even is a solution to this issue, as it seems like a hard limit imposed by how iOS maps Unicode character codes to PNG image data.
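For what it's worth, one way to probe that limit is to render a single emoji at a large point size and inspect the bitmap that comes back. Below is a minimal sketch, assuming the iOS copy of Apple Color Emoji contains larger bitmap strikes; the 160 pt size, the scale of 1.0, and the function name are my own choices for the experiment, not anything documented:

    import UIKit

    // Render one emoji at a large point size and return the bitmap.
    // If a 160 px strike exists, drawing at ~160 pt should select it; if the
    // largest strike is smaller, the glyph is upscaled and looks soft no
    // matter how big the context is.
    func probeEmojiResolution(_ emoji: String, pointSize: CGFloat = 160) -> UIImage? {
        let attributes: [NSAttributedString.Key: Any] =
            [.font: UIFont.systemFont(ofSize: pointSize)]
        // Size the context from the measured string so nothing is clipped.
        let size = (emoji as NSString).size(withAttributes: attributes)
        // Scale 1.0: one pixel per point, so the bitmap matches `size` exactly.
        UIGraphicsBeginImageContextWithOptions(size, false, 1.0)
        defer { UIGraphicsEndImageContext() }
        (emoji as NSString).draw(at: .zero, withAttributes: attributes)
        return UIGraphicsGetImageFromCurrentImageContext()
    }

Comparing the output at a few point sizes (say 48, 96, 160) should show whether sharpness stops improving at some strike size.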
Your issue is that you take a string at a specific font size and then force it into an area of another specific size. The two might not match, resulting in the drawn image being scaled to fit.
Either calculate the size needed for the given string and font or calculate the needed font size to allow the given string to fit into the allotted size.
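As a sketch of the second option (the helper name and the proportional-scaling approach are mine; text measures roughly linearly with point size, so this is an approximation rather than an exact fit):

    import UIKit

    // Scale the font down so that `string` fits inside `size`.
    // Starts from a reference point size and shrinks it proportionally.
    func fontSizeToFit(_ string: String, in size: CGSize, startingAt fontSize: CGFloat = 30) -> CGFloat {
        let measured = (string as NSString).size(
            withAttributes: [.font: UIFont.systemFont(ofSize: fontSize)])
        // The smaller of the width/height ratios guarantees both dimensions fit.
        let ratio = min(size.width / measured.width, size.height / measured.height)
        // Only shrink to fit; never enlarge past the reference size.
        return min(fontSize, fontSize * ratio)
    }

The first option, measuring the string and sizing the context to match, is what the probe sketch after the 10/18/16 edit above does.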
Replace
UIGraphicsBeginImageContext(rect.size)
or
UIGraphicsBeginImageContextWithOptions(rect.size, false, 0.0)
with
UIGraphicsBeginImageContextWithOptions(rect.size, false, 3.0)
A scale of 2.0 also works; I can't see a difference between that and 3.0 at a quick glance.
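Put together, here is a sketch of the question's extension with just that change applied (modernized syntax; exposing the scale as a parameter is my own addition). At scale 3.0 the 30x35 pt context becomes a 90x105 px bitmap, which is where the extra sharpness comes from:

    import UIKit

    extension String {
        // Same drawing code as in the question, but with an explicit scale.
        func image(scale: CGFloat = 3.0) -> UIImage? {
            let size = CGSize(width: 30, height: 35)
            // pixels = points * scale, so scale 3.0 yields a 90x105 px bitmap.
            UIGraphicsBeginImageContextWithOptions(size, false, scale)
            defer { UIGraphicsEndImageContext() }
            UIColor.white.set()
            let rect = CGRect(origin: .zero, size: size)
            UIRectFill(rect)
            (self as NSString).draw(in: rect, withAttributes: [.font: UIFont.systemFont(ofSize: 30)])
            return UIGraphicsGetImageFromCurrentImageContext()
        }
    }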