Storing NSAttributedString with UIImage in Core Data
I recently worked on fixing a bug in a simple note-taking app that let the user save notes containing text together with images. The app was built with Core Data, and notes that contained images were not being saved properly: when a note was later retrieved, the images were missing and only the text was present.
Converting NSAttributedString to NSData and vice versa
To display the note content, the app used a UITextView and its attributedText property, which is an NSAttributedString. Since Core Data doesn't support storing an NSAttributedString, the content had to be converted to NSData before saving:
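That conversion could look roughly like this (a minimal sketch; `textView` is the note's UITextView and `note.content` stands in for a Binary Data attribute on the managed object):

```swift
let attributedText = textView.attributedText ?? NSAttributedString()
let range = NSRange(location: 0, length: attributedText.length)

// Serialize the attributed string, including its attachments, as RTFD data.
if let data = try? attributedText.data(
    from: range,
    documentAttributes: [.documentType: NSAttributedString.DocumentType.rtfd]
) {
    note.content = data // a Binary Data attribute on the managed object
}
```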
Later, that value would be converted back from NSData to NSAttributedString:
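The reverse direction, again as a sketch with `data` being the stored value:

```swift
// Rebuild the attributed string (text plus image attachments) from the stored data.
if let attributedText = try? NSAttributedString(
    data: data,
    options: [.documentType: NSAttributedString.DocumentType.rtfd],
    documentAttributes: nil
) {
    textView.attributedText = attributedText
}
```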
Note that the documentType has to be rtfd during both conversions: RTFD is a rich text document format that supports attachments, which in our case are the images.
Adding images
We will use a UIImagePickerController to pick an image from the photo library and add it to the attributed text using NSTextAttachment. When the user picks an image, a delegate method from UIImagePickerControllerDelegate gets called:
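A sketch of that delegate method (`NoteViewController` and `insertImage(_:)` are hypothetical names standing in for the app's own view controller and insertion logic):

```swift
extension NoteViewController: UIImagePickerControllerDelegate, UINavigationControllerDelegate {

    func imagePickerController(
        _ picker: UIImagePickerController,
        didFinishPickingMediaWithInfo info: [UIImagePickerController.InfoKey: Any]
    ) {
        picker.dismiss(animated: true)
        // Grab the picked image and hand it over to the insertion logic.
        guard let image = info[.originalImage] as? UIImage else { return }
        insertImage(image)
    }
}
```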
Image resizing
We have to resize the image to fit the text view's frame width, using a custom resized(toWidth:) method on UIImage:
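A minimal sketch of such a helper, scaling the image proportionally with UIGraphicsImageRenderer:

```swift
extension UIImage {

    /// Scales the image to the given width, preserving its aspect ratio.
    func resized(toWidth width: CGFloat) -> UIImage {
        let scale = width / size.width
        let newSize = CGSize(width: width, height: size.height * scale)
        let renderer = UIGraphicsImageRenderer(size: newSize)
        return renderer.image { _ in
            draw(in: CGRect(origin: .zero, size: newSize))
        }
    }
}
```

Calling image.resized(toWidth: textView.frame.size.width) then yields an image that fits the text view.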
Creating an NSAttributedString using an encoded image
The most straightforward way of adding a UIImage to an NSAttributedString is by using an NSTextAttachment:
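In code, this approach boils down to a few lines (a sketch; `image` is the resized image from the previous step):

```swift
// Wrap the image in a text attachment and insert it at the cursor position.
let attachment = NSTextAttachment()
attachment.image = image
let imageString = NSAttributedString(attachment: attachment)
textView.textStorage.insert(imageString, at: textView.selectedRange.location)
```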
Unfortunately, this approach causes a bug when converting the attributed string into NSData: the image gets removed during the conversion. So I had to find another way of doing it. Fortunately, creating the attributed string from HTML doesn't suffer from this bug. We only need to embed the Base64 encoded image in an <img> tag:
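A minimal sketch of the HTML route, assuming `image` is the resized UIImage:

```swift
guard let imageData = image.pngData() else { return }
let base64 = imageData.base64EncodedString()
let html = "<img src=\"data:image/png;base64,\(base64)\">"

// Note: creating an NSAttributedString from HTML must happen on the main thread.
if let htmlData = html.data(using: .utf8),
   let imageString = try? NSAttributedString(
       data: htmlData,
       options: [
           .documentType: NSAttributedString.DocumentType.html,
           .characterEncoding: String.Encoding.utf8.rawValue
       ],
       documentAttributes: nil
   ) {
    textView.textStorage.insert(imageString, at: textView.selectedRange.location)
}
```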
It’s interesting to note that NSTextAttachment is used internally when the attributed string is created this way, which can easily be verified by inspecting the resulting string with the po command in the debugger.